
Adaptor Generic (version 1)

Prerequisites

Please make sure you have read through the Get Started/Overview documentation before continuing with this document.

Also make sure you understand the concepts of:

Overview of Data Connectivity

The generic adaptor provides six different ways to export and import data from different data sources. The supported data source types are:

  • CSV Spreadsheet Files
  • Web Services that support Ecommerce Standards Documents
  • MS SQL Server Databases
  • ODBC Compatible Databases
  • XML, JSON and CSV files stored locally, or from HTTP web services, or in the Connector itself
  • SQLite Databases (version 3)

Generic Adaptor System Diagram

Within the Generic adaptor, multiple data sources can be set up that each export or import data for a given data source type. For example, a data source could be set up that reads product data from a spreadsheet file saved in the CSV file format and exports the product data to an Ecommerce system.

An unlimited number of data sources can be created within the Generic adaptor. Each data export and data import within the adaptor can then be assigned to one of the data sources set up. This allows different data to be obtained from different data sources. For example, the product data may be obtained from a CSV spreadsheet file, whereas the customer account data may be obtained from an MS SQL Server database table.

Data Source Type: JSON/XML/CSV - Webservice/File

Within the Generic adaptor data sources can be set to read and write JSON, XML or CSV text data from a compatible web service, from files stored locally on a computer, or from data stored directly in the adaptor. This allows the Connector to read data from, or push data into, cloud based platforms and business systems that support querying JSON, XML or CSV data over HTTP. Additionally the "JSON/XML/CSV - Webservice/File" data source type can read JSON, XML or CSV files saved to the filesystem that the Connector is installed on.

Since JSON and XML data structures are hierarchical, the data source type can be configured to read through the data by matching on objects within the tree that meet certain criteria. These objects then act like table rows, and JSON or XML data can be extracted from each found object's own values, child objects, or parent objects. SQL-like functions can be used to manipulate and combine JSON or XML data for each data field in any given data export, which allows for highly customised querying of data in a relational database like fashion. CSV text data can also be converted into a JSON data structure, allowing the same advanced querying capabilities as the JSON and XML data structures.

If XML data is being queried, then it will be first turned into a JSON data structure, which then allows the data to be queried using JSON notation. Built into the Connector application is the JSON/XML/CSV Query Browser that allows testing and configuration to be performed, which can be very helpful when looking to set up a JSON/XML/CSV - Webservice/File data source type within a Generic adaptor.

If CSV text data is being queried, then it will first be turned into a JSON data structure, which then allows the data to be queried using JSON notation. The data source will read the first row of the CSV text data to find the names of the column headers, then use these to name the properties of the JSON object created for each subsequent row; a sketch of this conversion is shown below. The JSON/XML/CSV Query Browser can also be used to test and configure CSV data, which can be very helpful when looking to set up a JSON/XML/CSV - Webservice/File data source type within a Generic adaptor. Note that only CSV data saved with UTF-8 or ASCII character encoding is supported by the adaptor.
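
As a rough illustration of this conversion (a minimal Python sketch, not the Connector's own code), CSV text can be turned into the kind of JSON record list described above like so:

import csv
import io
import json

def csv_to_json_records(csv_text):
    # The first row's column headers become the property names of each
    # JSON object created for every subsequent row
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

csv_text = 'code,name\nTEATOW123,Tea Towel\nCP222,Cup\n'
print(json.dumps(csv_to_json_records(csv_text)))
# [{"code": "TEATOW123", "name": "Tea Towel"}, {"code": "CP222", "name": "Cup"}]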

When JSON/XML/CSV data is being written, the Generic adaptor will first build a JSON data structure; then, if configured, it will convert the JSON into an XML document. The resulting data can be saved as a file to the filesystem, or else sent across to a web service using the HTTP/HTTPS protocol.
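
To illustrate the JSON to XML conversion step (a simplified Python sketch only; the adaptor's own conversion rules may differ, and the data below is hypothetical), parsed JSON data can be recursively mapped to XML elements:

import json
import xml.etree.ElementTree as ET

def json_to_xml(tag, value):
    # Recursively build an XML element tree from parsed JSON data
    element = ET.Element(tag)
    if isinstance(value, dict):
        for key, child in value.items():
            element.append(json_to_xml(key, child))
    elif isinstance(value, list):
        for item in value:
            # Name repeated child elements by stripping a plural "s"
            element.append(json_to_xml(tag[:-1] or "item", item))
    else:
        element.text = str(value)
    return element

data = json.loads('{"order": {"keyOrderID": "SO-123", "lines": [{"productCode": "CP222"}]}}')
print(ET.tostring(json_to_xml("document", data), encoding="unicode"))
# <document><order><keyOrderID>SO-123</keyOrderID><lines><line>...</line></lines></order></document>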

JSON/XML/CSV - Webservice/File Data Source Type Settings

Below are the settings that can be configured for each JSON/XML/CSV - Webservice/File data source type.

Setting Description
Data Source Label Label of the data source that allows people to recognise the type of data or location where data is read or written to.
Protocol Sets the protocol that determines whether data should be sent to and from other web services using the HTTP web protocol, or read from and written to files locally. Set to one of the following options:
  • http://
    JSON, XML or CSV data is sent to or from a web service through a public (the internet) or private (intranet) computer network using the http protocol. The data will not be encrypted and any computers passing the data along may be able to see or store the contents of the data. Avoid using this protocol when sending or receiving data across the public internet.
  • https://
    JSON, XML or CSV data is sent to or from a web service through a public (the internet) or private (intranet) computer network using the https protocol. The data is securely encrypted and any computers passing the data along will not be able to read the contents of the data. Use this protocol when sending or receiving data across the public internet, such as from cloud based business systems. If using this protocol on internal computer networks, ensure that security certificates are installed on the computer that the Connector is installed on, otherwise connection issues may occur.
  • file:///
    JSON, XML or CSV files will be read or written directly to the file system.
Host/IP Address

If the Protocol setting is set to http:// or https:// then set the IP address or website domain where the JSON, XML or CSV data is being read from or written to. Additionally you can specify any root folders that are common to the web service, which will be prepended to each data export/import's URL.
If the Protocol setting is set to file:/// then set the file system drive and root directories where JSON, XML or CSV files may be stored. The path does not need to be the full folder path, since this setting will be prepended to each data export/import's URL.

Example values:

  • 192.168.0.22/examplews/endpoints/
    Address to a web service running on a computer within an internal network
  • www.squizz.com/rest/1/org/
    URL to a public cloud based web service available on the internet.
  • C:/data/xml-files/
    Directory path on the local file system's C drive where files may be stored
HTTP Request Headers

If the Protocol setting is set to either http:// or https:// then set the headers that will be placed in all data export/data import http requests made to the web service to obtain or push data. Common headers include:

Content-Type: application/json
Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
Accept-Encoding: gzip
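
The Authorization header above uses HTTP Basic authentication, where the value is simply the base64 encoding of "username:password". For instance, the credential can be produced like this (an illustrative Python sketch):

import base64

# "dXNlcm5hbWU6cGFzc3dvcmQ=" is the base64 encoding of "username:password"
credential = base64.b64encode(b"username:password").decode("ascii")
print("Authorization: Basic " + credential)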

Data Type

Specifies the format of data being sent out in data imports, or read in with data exports, from either a web service or file. Set to one of:

  • JSON
    The data will be written or read as JSON
  • XML
    For data exports the data will be read as an XML file that will then be converted into JSON data. For data imports JSON data will be converted into XML data, then imported into a web service or file. XML data being generated by the adaptor must have the XML version declaration for the adaptor to be able to correctly make an XML document from JSON.
  • CSV
    For data exports the data will be read as a CSV file that will then be converted into JSON data. For data imports any text set up in the data import will be sent out directly without any conversions.
HTTP Request Records Per Page

If the Protocol setting is set to either http:// or https:// then when a data export is run to request data from a web service, if this setting has a value larger than 1, it indicates the number of JSON, XML or CSV records that are expected to be returned from the web service per request. If the number of records returned equals or exceeds this setting then the adaptor will make another HTTP request to get the next page of data. Many web services have limits on how many records can be returned per request; when this is the case, set this value, and for each data export place data hooks within the URL or body to ensure that the next page of data is requested, otherwise infinite loops may occur where the same page of data is requested again and again.

Connection Testing URL

If the Protocol setting is set to either http:// or https:// then specify the endpoint URL that can be used to test if an HTTP request can be sent to the web service configured with the settings above. The URL does not need to be the full URL, since the values in the Protocol and Host/IP Address settings will be prepended to this URL.

Timeout Requests After Waiting If the Protocol setting is set to either http:// or https:// then set the number of seconds that the adaptor should wait to receive an HTTP response from the web service called by a data export or import before giving up. Choose a number that is neither too small nor too large, based on the speed of your internet connection and how long the web service can be expected to take to send or receive the data.

Data Source Type: CSV Spreadsheet File

Within the Generic adaptor data sources can be set up to read data from spreadsheet files that have been saved in the CSV file format. CSV (Comma Separated Values) is a file format that saves its data in a human readable form, and can be viewed within any text editing application such as Notepad or Gedit, as well as spreadsheet applications such as Microsoft Excel. The data in the file is structured in a table form containing columns and rows of data, with the first row containing the names of the columns. CSV files can be structured in different ways by using different characters to separate the columns, rows, and cells of data. When setting up the CSV file data source in the adaptor there are settings that define the characters used to tell how the CSV file data is read in. This allows great flexibility and customisation when reading in CSV files generated from different systems besides spreadsheet applications.

For each data export there is the ability to read in multiple CSV files of data. For example there may be four CSV files that contain product data. The Generic adaptor supports reading in all four of these files for a product data export, as long as the names and data of the files are consistent, eg. products1.csv, products2.csv, products10.csv. Settings exist for each data export to set a regular expression that will be used to match the files, so based on the example a rule may be written to match files where their name is "product[anynumber].csv" (see the sketch below). The files themselves do not need to end with the .csv file extension; they could be named .txt, .file, or anything else. As long as the content in the file is saved in CSV text form the adaptor will be able to read the data.
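
As a sketch of such a matching rule (illustrative Python, not the Connector's matching engine), a regular expression equivalent to "product[anynumber].csv" could be written as follows:

import re

# Matches products1.csv, products2.csv, products10.csv and so on
matcher = re.compile(r"products\d+\.csv")

filenames = ["products1.csv", "products2.csv", "products10.csv", "prices.csv"]
print([name for name in filenames if matcher.fullmatch(name)])
# ['products1.csv', 'products2.csv', 'products10.csv']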

CSV files are very useful when a business system can output and read in data in the file format, or when using a spreadsheet application to manually manage the data. For example an accounting system may not support product flags, so a Microsoft Excel spreadsheet may be used instead to manage a list of flags, and another spreadsheet used to manage the products assigned to flags. These spreadsheets can then be saved in the CSV file format, which allows the adaptor to read this data in and export it to the Ecommerce system set up with the Generic adaptor.

CSV Spreadsheet File Data Source Type Settings

Below are the settings that can be configured for each CSV data source type.

Setting Description
Data Source Label Label of the data source that allows people to recognise the type of data or location where data is read or written to.
CSV Files Folder Location Set the drive and root directory(s) on the file system where the CSV files are read or written to. The location does not need to specify the full folder location, as it will be prepended to the file path for each data export/import.
Data Field Delimiter

Set the characters within the CSV files being read or written that are used to split each data field for each data row.

Example CSV Data:
"value1","value2","value3"

The comma character is used to separate each data field.

Data Row Delimiter

Set the characters within the CSV files being read or written that are used to split each data row.

Example CSV Data:
"value1","value2"
"value3","value4"

The carriage return and line feed characters are used to separate each data row. These characters are invisible by default, however they can be set by placing \r and \n values into this setting.

Data Field Enclosing Character

Set the characters within the CSV files being read or written that are used to surround each value set within a data field. This ensures that the same characters used for data field and data row delimiters can be placed within the content of a data field.

Example CSV Data:
"value1","value2"
"value3","value4"

The double quote character is enclosing each data field's value.

Data Field Escaping Character

Set the characters within the CSV files being read or written that are placed in front of Data Field Enclosing Characters to indicate that the data field enclosing characters should not be treated as such.

Example CSV Data:
"value1","val"ue2"
"valu"e3","value4"

The backslash character escapes the double quote data field enclosing character, ensuring that the double quote character will appear in the data field values.
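
The four settings above can be seen working together in the following Python sketch (illustrative only, not the Connector's parser), which reads CSV text using a comma field delimiter, newline row delimiter, double quote enclosing character, and backslash escaping character:

import csv
import io

csv_text = '"value1","val\\"ue2"\n"valu\\"e3","value4"\n'
reader = csv.reader(
    io.StringIO(csv_text),
    delimiter=",",     # Data Field Delimiter
    quotechar='"',     # Data Field Enclosing Character
    escapechar="\\",   # Data Field Escaping Character
    doublequote=False,
)
for row in reader:
    print(row)
# ['value1', 'val"ue2']
# ['valu"e3', 'value4']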

Data Source Type: ESD Web Service

Within the Generic adaptor data sources can be set up to read and write data to a web service using the HTTP protocol. This allows data to be retrieved from any location within an in-house computer network, or over the internet. The adaptor specifically supports obtaining data that is formatted in the Ecommerce Standards Documents (ESD) format. The ESD format provides a standard way to structure data that is passed to and from Ecommerce systems and business systems. The Generic adaptor supports reading and writing ESD data from a RESTful web service that supports the ESD data format. The web service itself is a piece of computer software that can receive HTTP requests and return HTTP responses, the same technology that allows people to view web pages over the internet.

Using a web service allows the generic adaptor to retrieve and write data in real time, which can be very handy when the web service is connected to a business system where up-to-date information needs to be obtained. An example of this is product stock quantities. The generic adaptor could connect to a webservice, which then connects to a business system and obtains the available stock quantities of products. This stock quantity data can then be returned to the Ecommerce system to show a person the current stock quantity of a product and whether it is available to be purchased or not.

For each data export and data import configured in the Generic adaptor there is the ability to set the URL of the web service that will be used to request data, or export data to the webservice. For the web service data source there are also settings to configure the IP address or domain used to connect to the web service, the protocol, as well as set any custom key value pairs in the HTTP request headers.

In order to develop a web service that allows connections with the Generic adaptor, the web service will need to conform to the ESD data format, as well as be a RESTful web service. More information about the Ecommerce Standards Documents can be found at https://www.squizz.com/esd/index.html. A minimal sketch of such a web service appears below.
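
As a very rough sketch only (the document structure and field names below are illustrative assumptions, not the actual ESD format; consult the ESD specification for the real structure), a minimal RESTful endpoint that returns JSON data over HTTP could look like this in Python:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ExampleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/products":
            # Field names here are assumptions for illustration only
            body = json.dumps({
                "resultStatus": 1,
                "dataRecords": [{"keyProductID": "CP222", "name": "Cup"}],
            }).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ExampleHandler).serve_forever()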

HTTP Ecommerce Standards Documents Data Source Type Settings

Below are the settings that can be configured for each HTTP Ecommerce Standards Document data source type.

Setting Description
Data Source Label Label of the data source that allows people to recognise the type of data or location where data is read or written to.
Protocol Sets the protocol that determines whether data should be sent securely or insecurely to and from other web services using the HTTP web protocol. Set to one of the following options:
  • http://
    Data is sent to or from a web service through a public (the internet) or private (intranet) computer network using the http protocol. The data will not be encrypted and any computers passing the data along may be able to see or store the contents of the data. Avoid using this protocol when sending or receiving data across the public internet.
  • https://
    Data is sent to or from a web service through a public (the internet) or private (intranet) computer network using the https protocol. The data is securely encrypted and any computers passing the data along will not be able to read the contents of the data. Use this protocol when sending or receiving data across the public internet, such as from cloud based business systems. If using this protocol on internal computer networks, ensure that security certificates are installed on the computer that the Connector is installed on, otherwise connection issues may occur.
Host/IP Address

Set the IP address or website domain where the data is being read from or written to when the adaptor calls data exports and data imports to run. Additionally you can specify any root folders that are common to the web service, which will be prepended to each data export/import's URL.

Example values:

  • 192.168.0.22/examplews/endpoints/
    Address to a web service running on a computer within an internal network
  • www.squizz.com/rest/1/org/
    URL to a public cloud based web service available on the internet.
HTTP Request Headers

Set the headers that will be placed in all data export/data import http requests made to the web service to obtain or push data. Common headers include:

Content-Type: application/json
Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
Accept-Encoding: gzip

Connection Testing URL

Specify the endpoint URL that can be used to test if an HTTP request can be sent to the web service configured with the settings above. The URL does not need to be the full URL, since the values in the Protocol and Host/IP Address settings will be prepended to this URL.

Timeout Requests After Waiting Set the number of seconds that the adaptor should wait to receive an HTTP response from the web service called by a data export or import before giving up. Choose a number that is neither too small nor too large, based on the speed of your internet connection and how long the web service can be expected to take to send or receive the data.

Data Source Type: MS SQL Server

Within the Generic adaptor data sources can be set to read and write data to a Microsoft (MS) SQL Server database using the ADO.NET database libraries made available through the .NET framework that the Connector software uses. This allows data to be retrieved from any location within an in-house computer network, or over the wider internet if allowed. There are many business systems that use an MS SQL Server to store the system's data, and the MS SQL Server data source type within the Generic adaptor can be used to retrieve and write data for such business systems.

For the Generic adaptor's MS SQL Server data source type, each of the data exports and imports is configured using SQL (Structured Query Language). SQL queries can be set up to query and read data from tables within the database, as well as write data into specified tables. This makes it easy for a data specialist who understands the database table structure of a business system to create SQL queries which can obtain data from the business system and push it back into an Ecommerce System.

For the adaptor to write data back into an MS SQL Server database (such as for sales orders) it is recommended to create placeholder database tables that the Generic adaptor is configured to write data into, then a 3rd party piece of software is developed that can interpret the data and move it to all the database tables that its business system requires.

MS SQL Server Data Source Type Settings

Below are the settings that can be configured for each MS SQL Server data source type.

Setting Description
Data Source Label Set the label of the data source that allows people to recognise the type of data or location where data is read or written to.
MS SQL Server Data Source Location

Set the computer name or IP address that is running the MS SQL Server instance, as well as the SQL Server instance name.

Examples:

  • 192.168.0.20//SQLExpress
  • network-computer-name//EgSQLServer
MS SQL Server Data Source Name

Set the name of the database or catalogue that exists within the MS SQL Server instance.

Data Source User Name

Set the name of the user that is used to access the SQL Server instance. Ensure that the user is set up and has permissions to access the specified database/catalogue.

Data Source User Password

Set the password of the user that is used to access the SQL Server instance. Ensure that the user is set up and has permissions to access the specified database/catalogue.

Connection Testing SQL Query

Set an example SQL query that is used to test that a connection can be made to the configured SQL Server instance, and can successfully return data. Avoid setting a SQL query that returns lots of data. Set an SQL query such as: SELECT 1

Write Data Using Transactions If set to Yes then when the adaptor's data imports are configured to import data into the SQL Server instance and multiple record inserts occur, the data will only be written to the database, in a single transaction, if all inserts execute successfully. If set to No and only some of multiple database inserts execute successfully, then the inserted data will remain in the database and could exist in a partially completed state.
Write Data Using Prepared Statements If set to Yes then when the adaptor's data imports are configured to import data into a SQL Server instance, each data field will be added to an insert query using prepared statements. This ensures that the values being inserted cannot break the syntax of the SQL query, however in the Connector logs it means that you cannot see the values being placed within SQL insert queries. It's best to leave this setting set to Yes unless you are initially setting up the adaptor, or need to diagnose insert issues.
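
To illustrate what these two settings mean in practice, here is a sketch of the general pattern in Python using the pyodbc library (the connection string, table and column names are hypothetical, and this is not the Connector's own code):

import pyodbc

connection = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=192.168.0.20\\SQLExpress;DATABASE=ExampleDb;UID=user;PWD=secret",
    autocommit=False,  # Write Data Using Transactions = Yes
)
try:
    cursor = connection.cursor()
    # Write Data Using Prepared Statements = Yes: the ? placeholders stop
    # inserted values from breaking the syntax of the SQL query
    cursor.executemany(
        "INSERT INTO SalesOrderLines (OrderId, ProductCode, Quantity) VALUES (?, ?, ?)",
        [(1, "CP222", 5), (1, "TEATOW123", 2)],
    )
    connection.commit()  # data is only written if every insert succeeded
except Exception:
    connection.rollback()  # no partially completed data remains
    raise
finally:
    connection.close()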

Data Source Type: ODBC (Compatible Databases)

Within the Generic adaptor data sources can be set to read and write data to a database using the ODBC (Open DataBase Connectivity) protocol. ODBC is an older and widely used protocol supported by many different database technologies, such as MySQL, PostgreSQL and Microsoft Access, among many others. It can be used to allow data to be retrieved from any database location within an in-house computer network, or over the wider internet if allowed. There are many business systems that use ODBC enabled databases to store the system's data, and the ODBC data source type within the Generic adaptor can be used to retrieve and write data for such business system databases.

For the Generic adaptor's ODBC data source type, each of the data exports and imports is configured using SQL (Structured Query Language). SQL queries can be set up to query and read data from tables within the database, as well as write data into specified tables. This makes it easy for a data specialist who understands the database table structure of a business system to create SQL queries which can obtain data from the business system and push it back into an Ecommerce System.

For the adaptor to write data back into an ODBC enabled database (such as for sales orders) it is recommended to create placeholder database tables that the Generic adaptor is configured to write data into, then a 3rd party piece of software is developed that can interpret the data and move it to all the database tables that its business system requires.

ODBC Data Source Type Settings

Below are the settings that can be configured for each ODBC data source type.

Setting Description
Data Source Label Set the label of the data source that allows people to recognise the type of data or location where data is read or written to.
ODBC Data Source Name

Set the Data Source Name (DSN) set up within the Windows ODBC 32-bit Data Sources application. This determines the database that will be accessed, the Windows user who will access it, and the ODBC driver used to access it.

ODBC User Name

Set the name of the user that is used to access the database being connected to through ODBC. Ensure that the user is set up and has permissions to access the specified database.

ODBC User Password

Set the password of the user that is used to access the database being connected to through an ODBC connection. Ensure that the user is set up and has permissions to access the specified database.

Connection Testing SQL Query

Set an example SQL query that is used to test that a connection can be made to the configured database using an ODBC connection. Avoid setting a SQL query that returns lots of data. Set an SQL query such as: SELECT 1

Write Data Using Transactions If set to Yes then when the adaptor's data imports are configured to import data into the database instance and multiple record inserts occur, the data will only be written to the database, in a single transaction, if all inserts execute successfully. If set to No and only some of multiple database inserts execute successfully, then the inserted data will remain in the database and could exist in a partially completed state. Only set to Yes if the ODBC driver being used to connect to the database supports transactions.
Write Data Using Prepared Statements If set to Yes then when the adaptor's data imports are configured to import data into a database instance, each data field will be added to an insert query using prepared statements. This ensures that the values being inserted cannot break the syntax of the SQL query, however in the Connector logs it means that you cannot see the values being placed within SQL insert queries. Set this value to Yes only if the ODBC driver being used supports transactions and using placeholders in SQL queries.

Data Source Type: SQLite Database

Within the Generic adaptor data sources can be set to read and write data to a SQLite database file. SQLite databases are relational database management systems that can be embedded in single files and easily distributed or transferred. SQLite is one of the most popular database technologies, and is heavily used for storing data on mobile devices running iOS, Android, and other operating systems.

For the Generic adaptor's SQLite data source type, each of the data exports and imports is configured using SQL (Structured Query Language). SQL queries can be set up to query and read data from tables within the database, as well as write data into specified tables. This makes it easy for a data specialist who understands the database table structure to create SQL queries which can obtain data from the SQLite database and push it back into an Ecommerce System.

Note that SQLite databases support multiple threads reading a database at the same time, but for writing data to a table in the database it may place locks on the database. The Generic adaptor uses the System.Data.SQLite driver to read and write data from a SQLite version 3 database file. Visit the driver's website to find out more information about the SQL functions and database operations that are supported.

SQLite Data Source Type Settings

Below are the settings that can be configured for each SQLite data source type.

Setting Description
Data Source Label Set the label of the data source that allows people to recognise the type of data or location where data is read or written to.
SQLite Database File Path

Set the full file path to the SQLite database file (version 3) that is stored on the file system. If the file is stored on a network drive, then ensure that a UNC computer name is used in the path instead of a mapped network drive. Mapped network drives normally cannot be seen by the Connector's Host Windows Service that runs as a system Windows user.

Connection Testing SQL Query

Set an example SQL query that is used to test that a connection can be made to the configured SQLite database file. Avoid setting a SQL query that returns lots of data. Set an SQL query such as: SELECT 1

Write Data Using Transactions If set to Yes then when the adaptor's data imports are configured to import data into the SQLite database file and multiple record inserts occur, the data will only be written to the database, in a single transaction, if all inserts execute successfully. If set to No and only some of multiple database inserts execute successfully, then the inserted data will remain in the database and could exist in a partially completed state.
Write Data Using Prepared Statements If set to Yes then when the adaptor's data imports are configured to import data into a SQLite database file, each data field will be added to an insert query using prepared statements. This ensures that the values being inserted cannot break the syntax of the SQL query, however in the Connector logs it means that you cannot see the values being placed within SQL insert queries. Typically only set this to No when first configuring the adaptor or diagnosing data import issues.

Data Export Types

The Generic adaptor supports exporting data from the following types of data sources, based on the Data Source Type assigned to each of the adaptor's data exports.

Spreadsheet CSV Data Exports

For CSV spreadsheet data sources each data export needs to be configured to map the columns in the spreadsheet file to the Ecommerce Standards Document record fields. Some data exports will have their data joined together when the data is converted into the standards objects, and exported to the relevant Ecommerce system. Each data export can be configured to read from one or more CSV files. For each data export, by placing the wildcard asterisk character in the "CSV Text File Name Matcher" filename field, multiple files that contain the same prefix or suffix can be read, eg. myfile*.csv will match files named myfile_1.csv and myfile_2.csv. The exports can be configured to read CSV data from any file extension, not just .csv files. The data source also supports reading in files that use different characters to delimit data, such as tabs or spaces. There are data source settings where you define the characters that delimit field data, as well as rows. Additionally there are settings that allow you to specify the characters that can escape a character that would normally be used to delimit fields or rows.

SQL Database Based Data Exports

For ODBC, MS SQL Server, and SQLite database data sources each data export needs to be configured to read data from a specified database table (or joined tables, or view(s)), and map the fields in the table(s)/view(s) to the Ecommerce Standards Document record fields. Some data exports will have their data joined together when the data is converted into the standards objects, and exported to the relevant Ecommerce system.

Ecommerce Standards Documents Webservice Data Exports

For the ESD webservice data sources the exports are configured to send an HTTP GET request to a compatible webservice, which will return data in the native ESD serialized JSON format that the Connector will simply forward on to the relevant Ecommerce system. Some of the data exports listed below do not need to be configured for an ESD data source, since the data export's settings are not required to obtain the relevant data.

JSON/XML/CSV - Webservice/File Data Exports

For the JSON/XML/CSV webservice or file based data exports, a data source type first needs to be set up so that the adaptor knows whether to make an HTTP request to retrieve the JSON, XML or CSV data from a computer network or internet location, or otherwise read it from a file stored on the file system. This is done by setting the Protocol of the data source to either "http://", "https://", or "file:///". For HTTP connections the domain and header information will need to be set to allow connections to be made to the webservice. For each data export you can then set the relative URL path to call the most appropriate endpoint that will be used to return data. When setting up the data source you will need to specify whether XML, CSV or JSON data should be expected to be returned from the web service or locally found file. Some webservices may place limitations on how many records can be returned in one request. When this occurs the data source type has a setting that can specify the maximum number of records "per page" that will be returned in one request, and the adaptor can make subsequent requests to the same web service endpoint until all data has been obtained.

If XML or CSV data has been returned from either the web service or local file then it will first be converted into JSON data. After the JSON data has been successfully processed, then for each data export the "JSON Record Set Path" is used to locate the objects within the JSON data structure that represent records. The JSON Record Set Path needs to be set using a JSONPath formatted string. Using JSONPath you can place conditions on which objects are returned. For example dataRecords[*] would return all objects found within the array assigned to the dataRecords key in the JSON data. Another example, Items[*].SellingPrices[?(@.QuantityOver <= 0)], would find price records within the Selling Prices array that have the attribute "QuantityOver" with a value less than or equal to 0; the Selling Prices array would need to be a child of objects stored in the Items array (see the illustrative data below). For CSV files the path will need to be set to dataRecords[*]. Multiple paths can be set by placing the --UNION-- text between each path. When this occurs each path between the --UNION-- delimiters will be evaluated sequentially to read through the JSON/XML/CSV data, and its found rows will be appended to the previously obtained records. This works similarly to the UNION clause in the SQL database language.
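
For instance, against the following illustrative JSON response (hypothetical data), the path Items[*].SellingPrices[?(@.QuantityOver <= 0)] would select only the first price object (its QuantityOver is 0), while Items[*] would select the single product object:

{
    "Items": [
        {
            "ProductCode": "CP222",
            "SellingPrices": [
                {"QuantityOver": 0, "Price": 4.50},
                {"QuantityOver": 10, "Price": 3.90}
            ]
        }
    ]
}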

Once the adaptor has obtained a list of records from the JSON data, it will then iterate through each record and allow data to be retrieved from each JSON record object. Each record object's values can then be read into each data field for the data export. Additionally for each record field there is the ability to place functions that allow traversal through the JSON data structure to obtain the required data, as well as allow the data to be manipulated before being placed in the Ecommerce Standards Document record. Once all JSON record objects have been read, the data (which has been standardised) can then be sent on to the corresponding Ecommerce system that is to receive the data.

Data Export Pagination

When a data export is called, multiple requests may need to be made to external web services to return a full set of records to export. For example a web service may only return at maximum 100 products at a time, so additional requests may need to occur to get the next "page" of records (similar to turning a page to view the next set of products within a printed catalogue). The Generic adaptor's JSON/XML/CSV web service/file data source type supports making multiple requests for all data exports, also known as "pagination", to collate a full set of records. Only once a full set of records is collated will the adaptor export the data in its entirety. Within the JSON/XML/CSV web service/file data source type settings window a setting exists called "HTTP Request Records Per Page". If this is set to a number larger than 0 then it tells the adaptor how many records it expects to receive per page when running any data exports assigned to the data source. If the number of records retrieved equals or exceeds the number specified within the setting then the adaptor will make an additional request to get the next page of data.
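
The paging rule just described can be sketched as the following loop (illustrative Python only; the "dataRecords" response shape and page parameter are assumptions, not the Connector's internal code):

import requests

def fetch_all_pages(url_template, records_per_page):
    all_records, page_number = [], 1
    while True:
        response = requests.get(url_template.format(page=page_number))
        records = response.json()["dataRecords"]  # hypothetical response shape
        all_records.extend(records)
        # A page holding fewer records than expected means no further data
        if len(records) < records_per_page:
            return all_records
        page_number += 1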

Example

In the example below 2 requests must be made to the example.com web service to return all 5 product records that exist. Each request only allows a maximum of 3 records to be returned, and the web service accepts a parameter called "page" that controls which page of products is returned. When the Generic adaptor's Products data export is run it can be configured to call the web service as many times as needed (2 in this case) until all pages of product data have been obtained.

Page 1 Data, URL: http://www.example.com/api/products?page=1

Record Number | Product Name | Code
1 | Tea Towel | TEATOW123
2 | Cup | CP222
3 | Tea | TEALIQ

Page 2 Data, URL: http://www.example.com/api/products?page=2

Record Number | Product Name | Code
4 | Hand Soap | HSM247
5 | Sponge | SPS22322

Full Record Set: pages of records collated and exported by the Generic adaptor's Products data export

Record Number | Product Name | Code
1 | Tea Towel | TEATOW123
2 | Cup | CP222
3 | Tea | TEALIQ
4 | Hand Soap | HSM247
5 | Sponge | SPS22322

There are two ways in which records are counted for each page of data requested. The first is based on the number of records that are returned using the path set in the "JSON Record Set Path" setting configured in the data export. Alternatively, if a JSON record set path is set in the "Page Record Count Path" setting for a data export, the adaptor will instead locate the object in the JSON data structure and count the number of child objects assigned to it. This second option allows you to independently get the number of records found in a page of data, separate from the number of records being processed by the data export.

For example, if a request is made to a web service to get all the pricing of products and the web service returns product records with pricing included for each product record, then in the "JSON Record Set Path" you may need to set the path to data.products[*].prices[*]; however that could return 0 price records for a full page of products that have no assigned pricing data. If the web service is paging on product records and the adaptor is paging on pricing records, then the adaptor's data export may stop requesting more pages of prices when one page of products contains no pricing records. You can mitigate this problem by setting the data export's "Page Record Count Path" setting to data.products, to tell the data export to count the number of records that exist under the data.products object instead of the pricing records being processed, as illustrated below.
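
For example, a page like the following illustrative response contains two product records but no price records at all; counting via the data.products path keeps the pagination going even though data.products[*].prices[*] yields zero records:

{
    "data": {
        "products": [
            {"code": "CP222", "prices": []},
            {"code": "TEATOW123", "prices": []}
        ]
    }
}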

Within the URL, request headers, request body and record set path settings of each data export in the adaptor you can set the data hooks listed further below to control how the URL is changed to get multiple pages of data. Examples:

Example: Paginate By Page Number

http://www.example.com/api/products?page={pageNumber}
Output:
http://www.example.com/api/products?page=1
http://www.example.com/api/products?page=2
http://www.example.com/api/products?page=3

Example: Paginate By Last Record Of Previous Page

http://www.example.com/api/products?starting-product-id={lastKeyRecordIDURIencoded}&recordSize={recordsPerPage}
Output:
http://www.example.com/api/products?starting-product-id=&recordSize=100
http://www.example.com/api/products?starting-product-id=GAB332&recordSize=100
http://www.example.com/api/products?starting-product-id=GFH33211&recordSize=100

Example: Paginate By Record Index

http://www.example.com/api/products?starting-index={pageOffset}&recordSize={recordsPerPage}
Output:
http://www.example.com/api/products?starting-index=0&recordSize=100
http://www.example.com/api/products?starting-index=100&recordSize=100
http://www.example.com/api/products?starting-index=200&recordSize=100

JSON/XML/CSV Data Hooks

Any of the following data hooks may be embedded within the data export "Data Export URL", "JSON RecordSet Path", "HTTP Request Headers", and "HTTP Request Body" settings, and will get replaced with values allowing record searches to be set up within JSON/XML/CSV webservice/file requests:

  • {recordsPerPage}
    Number of records to retrieve for each page of records requested. This number is the same as set within the data export's assigned JSON/XML/CSV data source type.
  • {pageNumber}
    The current page number being processed by the data export. The page number starts from number 1.
  • {pageIndex}
    The current page number being processed by the data export, with the page number offset by -1. Eg. for page 1 this hook would return 0, for page 2 it returns 1.
  • {pageOffset}
    The current page number being processed by the data export, offset by -1 and multiplied by the number of records per page. For example if each page was configured in the export's assigned JSON/XML/CSV data source type to contain 100 records, then when the export is called to retrieve the 1st page of data this hook's value would be 0, then 100 for the 2nd page, then 200 for the 3rd ((page 3 - 1) * 100 records per page).
  • {pageIdentifier}
    Gets the identifier of the page from the JSON path set in the Next Page Identifier Path of the data export. This can be used to obtain a specific identifier from the returned response data, which identifies a value to use in the next request to obtain the next page of data. The value is empty for the first web page request.
  • {lastKeyRecordID}
    Gets the key identifier of the last record that was retrieved by the data export. This is useful if pages of records are paginated not by number, but based on alphabetical order, using the key (unique identifier) of the record as the starting basis to retrieve the next page of records.
  • {lastKeyRecordIDXMLencoded}
    Same value as {lastKeyRecordID}, except that the value has been XML encoded to allow it to be safely embedded within XML or HTML data without breaking the structure of such documents.
  • {lastKeyRecordIDURIencoded}
    Same value as {lastKeyRecordID}, except that the value has been URI encoded to allow it to be safely embedded within URLs (avoiding breaking the syntax of a URL).
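
A rough sketch of how this hook substitution behaves (illustrative Python only, covering a subset of the hooks above):

from urllib.parse import quote

def apply_data_hooks(template, page_number, records_per_page, last_key_record_id=""):
    # Replace the pagination data hooks listed above within a URL template
    return (template
            .replace("{recordsPerPage}", str(records_per_page))
            .replace("{pageNumber}", str(page_number))
            .replace("{pageIndex}", str(page_number - 1))
            .replace("{pageOffset}", str((page_number - 1) * records_per_page))
            .replace("{lastKeyRecordIDURIencoded}", quote(last_key_record_id)))

print(apply_data_hooks("http://www.example.com/api/products?page={pageNumber}", 2, 100))
# http://www.example.com/api/products?page=2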


Dynamic HTTP Request Generation

When a JSON/XML/CSV Web Service/File data source type is assigned to a data export, within each of the data export settings (Data Export URL, JSON RecordSet Path, Page Record Count Path, HTTP Request Headers, HTTP Request Body, and Data Export Result Matcher), there is the ability to call the JSON/XML/CSV field functions to determine the final value set within each of these fields. To do so set the following text within any of the fields:

==JSON-FUNC-EVAL==

JSON-FUNC-EVAL is short for "JSON Function Evaluate". Any text placed after this keyword will be treated as JSON data, allowing any of the JSON/XML/CSV field functions to be used to determine the request field's value.

For example placing the following text into the HTTP Request Body will allow the HTTP request body to have its content dynamically set:

Example 1:

Setting Value: ==JSON-FUNC-EVAL==CONCAT('timeout=',CALC('5*60*1000'))
Output: timeout=300000

Request JSON Data
Each of the request evaluating functions can utilise the following request JSON object data:
{
    "nonce": "4859b7f7f7534f2087cec23d977fe299",
    "datetimenow_utc_unix": 1556520888402,
    "request_url": "https://example.com/get/data/from/webservice?param1=a&param2=b",
    "request_http_method": "GET"
}

Request JSON Data | Description
nonce | Returns a randomly generated value, based on a Universal Unique Identifier (UUID). The value generated at the time a request is made is guaranteed to be unique compared to values generated at any other time.
datetimenow_utc_unix | Returns the current date time at which the request is made. The date time is a 64bit long integer storing the number of milliseconds since the 01/01/1970 UTC epoch. This value can be converted into a human readable date time format using the DATESTR_CONVERT function.
request_url | Contains the fully evaluated request URL that will be used to call the data source to get or push data. The URL is composed of the data source's protocol, Host/IP Address, and the data export/import's URL.
request_http_method | Contains the HTTP request method used to call the configured data source, based on the HTTP Request Method set within the data export or data import.
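
As an illustration (assumed Python equivalents, not the Connector's own code), the nonce and datetimenow_utc_unix values can be thought of as being produced like this:

import time
import uuid

request_data = {
    # A UUID based value, unique for every request made
    "nonce": uuid.uuid4().hex,
    # Milliseconds since the 01/01/1970 UTC epoch, as a 64bit integer
    "datetimenow_utc_unix": int(time.time() * 1000),
    "request_url": "https://example.com/get/data/from/webservice?param1=a&param2=b",
    "request_http_method": "GET",
}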

 

Saving HTTP Response Data To File

When the JSON/XML/CSV Web Service/File data source type is assigned to a data export, if the data source type is configured to obtain data from a web service using an HTTP or HTTPS request, there is the ability to save the data returned in the HTTP response body to a file before the data is processed. This is useful if multiple data exports need to export the same data, as well as for testing and diagnostics, to check the raw data being returned from a web service.

Within the Generic adaptor's settings window, under the Data Exports tab, when a JSON/XML/CSV Web Service/File data source type is assigned to a data export, within the JSON/XML/CSV Settings tab a setting labelled "Save HTTP Response Data To Folder" exists. If the setting is not left empty and is configured with a full file path (including file name and extension), then when an HTTP request is made to obtain data from a web service, the data returned in the web service's HTTP response body will be saved to a file at the configured file system location and file name. If the file path contains the data hook {pageNumber} then the hook will be replaced by the page number of the request made to obtain the data. This allows many pages of data to be separately saved to the file system, useful for when pagination occurs to obtain a full data set from the web service across multiple requests.

Before the Generic adaptor's data export saves any data to a folder location, it will create a file labelled "Data-Files-Incomplete.lock" in the same location. This ensures that if multiple pages of data are required to obtain a full record set from the web service, all of the pages' data files must be fully saved to the file system. If this doesn't occur, the lock file will remain in place to denote that the directory contains incomplete data that should not be read. Once all of the pages' request data has been successfully saved the lock file will be removed. If this lock file is in place when other data exports try to read from the same directory then an error will occur and the data exports will fail. This ensures that any data exports linked to the same data gracefully fail and don't export incomplete data. The check for the lock file can be ignored by placing the parameter "?ignore-data-files-incomplete-lock" at the end of the file path. When this is set the adaptor will continue to look for files regardless of whether a lock file exists or not. A sketch of this locking behaviour appears below.
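
A sketch of that locking behaviour (illustrative Python only; all names apart from Data-Files-Incomplete.lock are hypothetical):

from pathlib import Path

def fetch_pages():
    # Stand-in for paginated HTTP responses from a web service
    return ['{"page": 1}', '{"page": 2}']

data_dir = Path("response-pages")  # hypothetical save location
data_dir.mkdir(exist_ok=True)
lock_file = data_dir / "Data-Files-Incomplete.lock"

# Writer: create the lock, save every page, then remove the lock
lock_file.touch()
for page_number, page_body in enumerate(fetch_pages(), start=1):
    (data_dir / f"products-page-{page_number}.json").write_text(page_body)
lock_file.unlink()  # only reached once every page was saved successfully

# Reader: refuse to read while the lock file is present
if lock_file.exists():
    raise RuntimeError("Directory contains incomplete data; aborting export")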

Data Exports

The Generic adaptor has the data exports listed in the table below that can be configured to read data from its chosen ODBC database, MS SQL Server database, CSV spreadsheet file(s), ESD supported web service, or JSON/XML/CSV based data sources. The obtained data is then converted into an Ecommerce Standards Document, and exported from the adaptor into the configured Ecommerce system or Connector Data Set database for importing.

Data Export Definitions

Data Export Description
Adaptor Status Returns the status of the adaptor's connection to its data source.
Attribute Profile Attributes

Obtains a list of attributes assigned to an attribute profile. This data is mapped to the ESDRecordAttribute record when exported. The data export is joined with the Attribute Profile Product Values and Attribute Profiles data exports to construct an ESDocumentAttribute document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Attribute Profile Product Values Obtains a list of attribute values assigned to products. This data is mapped to the ESDRecordAttributeValue record when exported. The data export is joined with the Attribute Profile Attributes and Attribute Profiles data exports to construct an ESDocumentAttribute document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow attribute data to be exported.

Attribute Profiles

Obtains a list of attribute profiles. This data is mapped to the ESDRecordAttributeProfile record when exported. The data is joined with the Attribute Profile Attributes and Attribute Profile Product Values data exports to construct an ESDocumentAttribute document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Category Trees

Obtains a list of category trees. This data is mapped to the ESDRecordCategoryTree record when exported. The data is joined with the Categories data export to construct an ESDocumentCategory document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Categories

Obtains a list of categories. This data is mapped to the ESDRecordCategory record when exported. The data is joined with the Category Products data export to construct an ESDocumentCategory document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow category data to be exported.

Category Products Obtains a list of mappings that assign products to categories. This data is placed within an ESDRecordCategory record when exported. The data is joined with the Categories data export to construct an ESDocumentCategory document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Currency Exchange Rates Obtains a list of currency exchange rates. Each currency exchange rate record contains the rate that determines how much it would cost to buy one currency with another. The data is placed within an ESDRecordCurrencyExchangeRate record when exported. The records are then all placed into an ESDocumentCurrencyExchangeRate document which is exported out in its final state to the Ecommerce system.
Combination Product Parents

Obtains a list of products that are set as the parent product of an assigned combination. Each product is assigned to a combination profile. The data is placed within an ESDRecordProductCombinationParent record when exported. The data is joined with the Combination Products, Combination Profile Field Values, Combination Profile Fields, and Combination Profiles data exports to construct an ESDocumentProductCombination document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow product combination data to be exported.

Combination Products Obtains a list of mappings that allow a product to be assigned to a parent product for a given field and value in a combination. The data is placed within an ESDRecordProductCombination record when exported. The data is joined with the Combination Product Parents, Combination Profile Field Values, Combination Profile Fields, and Combination Profiles data exports to construct an ESDocumentProductCombination document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Combination Profile Field Values Obtains a list of values that are associated with fields in a combination profile. The data is placed within an ESDRecordCombinationProfileField record when exported. The data is joined with the Combination Product Parents, Combination Products, Combination Profile Fields, and Combination Profiles data exports to construct an ESDocumentProductCombination document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Combination Profile Fields Obtains a list of fields that are associated to a combination profile. The data is placed within an ESDRecordCombinationProfileField record when exported. The data is joined with the Combination Product Parents, Combination Products, Combination Profile Field Values, and Combination Profiles data exports to construct an ESDocumentProductCombination document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Combination Profiles Obtains a list of combination profiles. The data is placed within an ESDRecordCombinationProfile record when exported. The data is joined with the Combination Product Parents, Combination Products, Combination Profile Field Values, and Combination Profile Fields data exports to construct an ESDocumentProductCombination document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Contract Customer Accounts

Obtains a list of mappings between customer accounts and contracts. The data is placed within an ESDRecordContract record when exported. The data is joined with the Contract Products and Contracts data exports to construct an ESDocumentCustomerAccountContract document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Contract Products

Obtains a list of mappings between products and contracts. The data is placed within an ESDRecordContract record when exported. The data is joined with the Contract Customer Accounts and Contracts data exports to construct an ESDocumentCustomerAccountContract document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Contracts

Obtains a list of contracts. The data is placed within an ESDRecordContract record when exported. The data is joined with the Contract Customer Accounts and Contract Products data exports to construct an ESDocumentCustomerAccountContract document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow contract data to be exported.

Customer Account Addresses Obtains a list of addresses assigned to customer accounts. The data is placed within an ESDRecordCustomerAccountAddress record when exported. The record data is all placed into an ESDocumentCustomerAccountAddress document which is exported out in its final state to the Ecommerce system.
Customer Account Payment Types Obtains a list of payment types assigned to specific customer accounts. The data is placed within an ESDRecordPaymentType record when exported. The data is joined with the Customer Accounts data export to construct an ESDocumentCustomerAccount document which is exported out in its final state to the Ecommerce system.
Customer Accounts Obtains a list of customer accounts. The data is placed within an ESDRecordCustomerAccount record when exported. The record data is joined with the Customer Account Payment Types data export to construct an ESDocumentCustomerAccount document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow customer account data to be exported.
Data Source Session Create If assigned to a JSON/XML/CSV Webservice/File data source type then this data export will be called to attempt to log in and create a new session with the data source's web service. If successful, a session ID returned by the web service can then be passed on and used with subsequent calls to the web service in other data exports. This data export will get called any time any other data export is assigned to a JSON/XML/CSV Webservice/File data source type, and only if a session ID is able to be obtained will the other data export get called to run.
Data Source Session Destroy If assigned to a JSON/XML/CSV Webservice/File data source type then this data export will be called to attempt to log out and destroy an existing session with the data source's web service, based on the Data Source Session Create data export having been called. This data export will get called any time any other data export is assigned to a JSON/XML/CSV Webservice/File data source type, and only if a session ID was able to be obtained in a previous Data Source Session Create data export call. It is highly advisable to configure this data export if the Data Source Session Create data export is configured. This ensures that sessions are correctly destroyed and do not live longer than the absolute minimum amount of time, to stop 3rd party software from gaining access to sensitive information.
Flagged Products

Obtains a list of mappings between products and flags. The data is placed within an ESDRecordFlagMapping record when exported. The data is joined with the Flags data export to construct an ESDocumentFlag document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow product flag data to be exported.

Flags

Obtains a list of flags. The data is placed within an ESDRecordFlag record when exported. The data is joined with the Flagged Products data export to construct an ESDocumentFlag document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

General Ledger Accounts

Obtains a list of general ledger accounts. The data is placed within an ESDRecordGeneralLedgerAccount record when exported. The records are then all placed into an ESDocumentGeneralLedgerAccount document which is exported out in its final state to the Ecommerce system.

Item Group Products

Obtains a list of item groups assigned to specific products. The data is placed within an ESDRecordItemGroup record when exported. The data is joined with the Item Groups data export to construct an ESDocumentItemGroup document which is exported out in its final state to the Ecommerce system.
Item Groups

Obtains a list of item groups. The data is placed within an ESDRecordItemGroup record when exported. The data is joined with the Item Group Products data export to construct an ESDocumentItemGroup document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow item group data to be exported.

Item Relations

Obtains a list of item relations, which allow one product to be linked to another product. The record data is placed within an ESDRecordItemRelation record when exported. The records are then all placed into an ESDocumentItemRelation document which is exported out in its final state to the Ecommerce system.
Location Customer Accounts

Obtains a list of mappings between customer accounts and locations. The data is placed within an ESDRecordCustomerAccount record when exported. The data is joined with the Customer Accounts data export to construct an ESDocumentCustomerAccount document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Location Products

Obtains a list of mappings between products and locations, with each mapping defining the stock quantity of a product available at a location. The data is placed within an ESDRecordStockQuantity record when exported. The data is joined with the Locations and Location Customer Accounts data exports to construct an ESDocumentLocation document which is exported out in its final state to the Ecommerce system.

For ESD webservice data sources this data export is not used and does not need to be configured.

Locations

Obtains a list of locations. The data is placed within an ESDRecordLocation record when exported. The data is joined with the Location Products and Location Customer Accounts data exports to construct an ESDocumentLocation document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow location data to be exported.

Makers

Obtains a list of makers, also known as manufacturers, who create products and models. The record data is all placed into an ESDocumentMaker document which is exported out in its final state to the Ecommerce system.
Maker Models

Obtains a list of maker models. These models are created by makers/manufacturers and may be made up of several products/parts. The data is placed within an ESDRecordMakerModel record when exported. The data is joined with the Maker Model Attributes data export to construct an ESDocumentMakerModel document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow maker model data to be exported.

Maker Model Attributes

Obtains a list of attributes for maker models. These attributes allow any kind of data to be set against model records. The data is placed within an ESDRecordAttributeValue record when exported. The data is joined with the Maker Models data export to construct an ESDocumentMakerModel document which is exported out in its final state to the Ecommerce system.
Maker Model Mappings

Obtains a list of maker model mappings, which allow products/parts to be assigned to maker models for given categories. The data is placed within an ESDRecordMakerModelMapping record when exported. The data is joined with the Maker Model Mapping Attributes data export to construct an ESDocumentMakerModelMapping document which is exported out in its final state to the Ecommerce system.

For all data source types this data export must be configured to allow maker model mapping data to be exported.

Maker Model Mapping Attributes

Obtains a list of attributes for maker model mappings. These attributes allow any kind of data to be set against model mapping records. The data is placed within an ESDRecordAttributeValue record when exported. The data is joined with the Maker Model Mappings data export to construct an ESDocumentMakerModelMapping document which is exported out in its final state to the Ecommerce system.
Payment Types

Obtains a list of payment types. The record data is placed within an ESDRecordPaymentType record when exported. The records are then all placed into an ESDocumentPaymentType document which is exported out in its final state to the Ecommerce system.
Price Levels

Obtains a list of price levels. The data is placed within an ESDRecordPriceLevel record when exported. The record data is all placed into an ESDocumentPriceLevel document which is exported out in its final state to the Ecommerce system.
Product Alternate Codes

Obtains a list of alternate codes assigned to products. The data is placed within an ESDRecordAlternateCode record when exported. The record data is all placed into an ESDocumentAlternateCode document which is exported out in its final state to the Ecommerce system.
Product Attachments

Obtains a list of attachment files assigned to products. The data is placed within an ESDRecordAttachment record when exported. The adaptor will locate the file set in the record, and if found will upload the file to the Ecommerce system.
Product Customer Account Pricing

Obtains a list of product prices assigned to specific customer accounts or pricing groups. The data is placed within an ESDRecordPrice record when exported. The data is joined with the Product Customer Account Pricing Groups data export to construct an ESDocumentPrice document which is exported out in its final state to the Ecommerce system.
Product Customer Account Pricing Groups

Obtains a list of mappings between customer accounts and pricing groups. The data is collated and joined with the Product Customer Account Pricing data export to construct an ESDocumentPrice document which is exported out in its final state to the Ecommerce system.
Product Images

Obtains a list of image files assigned to products. The data is placed within an ESDRecordImage record when exported. The adaptor will locate the file set in the record, and if found will check that it is an image file, perform any required resizing, then upload the file to the Ecommerce system.
Product Kits

Obtains a list of component products assigned to kits. The data is placed within an ESDRecordKitComponent record when exported. The record data is all placed into an ESDocumentKit document which is exported out in its final state to the Ecommerce system.
Product Pricing

Obtains a list of price-level prices assigned to products. The data is placed within an ESDRecordPrice record when exported. The record data is all placed into an ESDocumentPrice document which is exported out in its final state to the Ecommerce system.
Product Quantity Pricing

Obtains a list of price-level quantity-based prices assigned to products. The data is placed within an ESDRecordPrice record when exported. The record data is all placed into an ESDocumentPrice document which is exported out in its final state to the Ecommerce system.
Product Sell Units

Obtains a list of sell units assigned to specific products. The data is placed within an ESDRecordSellUnit record when exported. The data is joined with the Products data export to construct an ESDocumentProduct document which is exported out in its final state to the Ecommerce system.
Product Stock Quantities

Obtains a list of stock quantities associated with a single product, or with a list of products. The data is placed within an ESDRecordStockQuantity record when exported. The record data is all placed into an ESDocumentStockQuantity document which is exported out in its final state to the Ecommerce system.

Data Hooks:
The following hooks can be embedded within the data export's settings to control how individual, or all, product stock quantities are exported.

  • {obtainAll}: Outputs either Y or N. If set to Y then the data export needs to retrieve stock quantity records for all products; if set to N then the data export should only obtain a stock quantity record for an individual product.
  • {keyProductID}: Key identifier of the product to obtain a stock quantity record for. This should only be used if the {obtainAll} data hook is set to N. The value outputted is URL encoded.
  • {keyProductIDRaw}: Key identifier of the product to obtain a stock quantity record for. This should only be used if the {obtainAll} data hook is set to N. The value outputted is the raw value. If used within HTTP requests it should be encoded to avoid injection attacks.
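
As a purely illustrative sketch, a Data Export URL for a JSON/XML/CSV web service data source could embed these hooks as follows (the endpoint and parameter names are assumptions, not part of the adaptor):

  https://api.example.com/stock-quantities?all={obtainAll}&productID={keyProductID}

When {obtainAll} outputs Y, the hypothetical web service would ignore the productID parameter and return stock quantity records for all products; when it outputs N, only the stock quantity record for the given product would be returned.
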
Products

Obtains a list of products. The data is placed within an ESDRecordProduct record when exported. The product record data is joined with the Product Sell Units data export and together all placed into an ESDocumentProduct document which is exported out in its final state to the Ecommerce system.
Purchasers

Obtains a list of purchasers. The data is placed within an ESDRecordPurchaser record when exported. The records are then all placed into an ESDocumentPurchaser document which is exported out in its final state to the Ecommerce system.
Sales Orders

Obtains a list of sales orders. Each order's data is placed within an ESDRecordOrderSale record and combined into a list. The list of sales order records is used to call the Sales Order data export to retrieve the full details of each sales order record found.
Sales Order

Obtains the details of a single sales order record. The data is placed within an ESDRecordOrderSale record and combined into a list. The records are then all placed into an ESDocumentOrderSale document which is exported out in its final state to the Ecommerce system.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
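
For illustration only, these hooks could be embedded within the Sales Order data export's settings when fetching a single order's details, for example within a hypothetical web service Data Export URL (the endpoint and parameter names are assumptions, not part of the adaptor):

  sales-orders/{keySalesOrderID}?account={keyCustomerAccountID}
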
Sales Order Lines

Obtains a list of lines for a single sales order obtained from the Sales Order data export. The line data is placed within an ESDRecordOrderSaleLine record and combined into a list of line records. The line records are then all placed into an ESDocumentOrderSale document for the sales order record obtained in the Sales Order data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Order Line Taxes

Obtains a list of taxes for each sales order line obtained from the Sales Order Lines data export. The line tax data is placed within an ESDRecordOrderLineTax record and combined into a list of line tax records. The line tax records are then all placed into the ESDRecordOrderSaleLine record for each line record found in the Sales Order Lines data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Order Line Attribute Profiles

Obtains a list of attribute profiles for each sales order line obtained from the Sales Order Lines data export. The line attribute profile data is placed within an ESDRecordOrderLineAttributeProfile record and combined into a list of line attribute profile records. The line's attribute profile records are then all placed into the ESDRecordOrderSaleLine record for each line record found in the Sales Order Lines data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Order Line Attributes

Obtains a list of attributes for each sales order line's attribute profile obtained from the Sales Order Line Attribute Profiles data export. The line attribute data is placed within an ESDRecordOrderLineAttribute record and combined into a list of line attribute records. The line's attribute records are then all placed into the ESDRecordOrderLineAttributeProfile record for each attribute profile record found in the Sales Order Line Attribute Profiles data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Order Payments

Obtains a list of payments for a single sales order obtained from the Sales Order data export. The payment data is placed within an ESDRecordOrderPayment record and combined into a list of payment records. The payment records are then all placed into an ESDocumentOrderSale document for the sales order record obtained in the Sales Order data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Order Surcharges

Obtains a list of surcharges for a single sales order obtained from the Sales Order data export. The surcharge data is placed within an ESDRecordOrderSurcharge record and combined into a list of surcharge records. The surcharge records are then all placed into an ESDocumentOrderSale document for the sales order record obtained in the Sales Order data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Order Surcharge Taxes

Obtains a list of taxes for each sales order surcharge obtained from the Sales Order Surcharges data export. The surcharge tax data is placed within an ESDRecordOrderLineTax record and combined into a list of surcharge tax records. The surcharge tax records are then all placed into the ESDRecordOrderSurcharge record for each surcharge record found in the Sales Order Surcharges data export.

Data Hooks:
The following hooks contain data obtained for each sales order record retrieved by the Sales Orders data export, and may be used in the data export's fields.

  • {keySalesOrderID}: Key Identifier of the sales order
  • {salesOrderCode}: Code of the sales order
  • {keyCustomerAccountID}: Key identifier of the customer account
  • {customerAccountCode}: Code of the customer account
  • {createdDate}: date the sales order was created, in milliseconds since the 01/01/1970 12AM UTC epoch
  • {salesOrderIndex}: index of the sales order record being obtained.
  • {lineIndex}: index of the sales order line record being obtained.
Sales Representatives

Obtains a list of sales representatives. The data is placed within an ESDRecordSalesRep record when exported. The record data is all placed into an ESDocumentSalesRep document which is exported out in its final state to the Ecommerce system.
Sell Units

Obtains a list of sell units. The data is placed within an ESDRecordSellUnit record when exported. The records are then all placed into an ESDocumentSellUnit document which is exported out in its final state to the Ecommerce system.
Supplier Accounts

Obtains a list of supplier accounts. The data is placed within an ESDRecordSupplierAccount record when exported. The record data is all placed into an ESDocumentSupplierAccount document which is exported out in its final state to the Ecommerce system.
Supplier Account Addresses

Obtains a list of addresses assigned to supplier accounts. The data is placed within an ESDRecordSupplierAccountAddress record when exported. The record data is all placed into an ESDocumentSupplierAccountAddress document which is exported out in its final state to the Ecommerce system.
Surcharges

Obtains a list of surcharges. The data is placed within an ESDRecordSurcharge record when exported. The records are then all placed into an ESDocumentSurcharge document which is exported out in its final state to the Ecommerce system.
Taxcodes

Obtains a list of taxcodes. The data is placed within an ESDRecordTaxcode record when exported. The records are then all placed into an ESDocumentTaxcode document which is exported out in its final state to the Ecommerce system.

Data Exports - Account Enquiry

The Generic adaptor has the data exports listed in the table below, which can only be called upon to retrieve records from external systems through the Connector. These exports allow for real time querying of records linked to customer or supplier accounts. They can be used to allow customers, staff, or other people or software to search for and enquire about records in real time, such as looking up invoices or payments. The obtained data is converted into an Ecommerce Standards Document, and returned from the adaptor to the software calling the Connector, for it to display or use the retrieved data.

JSON/XML/CSV Data Hooks

Any of the following data hooks may be embedded within any of the JSON/XML settings of the account enquiry based data exports listed below, when the data exports are assigned to a JSON/XML/CSV - Web Service/File data source type. These hooks will get replaced with values allowing record searches to be set up within JSON/XML/CSV webservice/file requests:

  • {recordsPerPage}
    Number of records to retrieve for each page of records requested. This number is the same as set within the data export's assigned JSON/XML/CSV data source type.
  • {pageNumber}
    The current page number being processed by the data export. The page number starts from number 1.
  • {pageIndex}
    The current page number being processed by the data export, with the page number offset by -1. E.g. for page 1 this hook would return 0, for page 2 it returns 1.
  • {pageOffset}
    The current page index multiplied by the number of records per page. For example, if each page was configured in the export's assigned JSON/XML/CSV data source type to contain 100 records, then this hook's value would be 0 when retrieving the 1st page, then 100, then 200 ((page 3 - 1) * 100 records per page).
  • {lastKeyRecordID}
    Gets the key identifier of the last record that was retrieved by the data export. This is useful if pages of records are paginated not by number, but based on alphabetical order, using the key (unique identifier) of the record as the starting basis to retrieve the next page of records.
  • {lastKeyRecordIDXMLencoded}
    Same value as {lastKeyRecordID}, except that the value has been XML encoded to allow it to be safely embedded within XML or HTML data without breaking the structure of such documents.
  • {lastKeyRecordIDURIencoded}
    Same value as {lastKeyRecordID}, except that the value has been URI encoded to allow it to be safely embedded within URLs (avoiding breaking the syntax of a URL).
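
As a purely illustrative example, the paging hooks could be embedded in a Data Export URL as follows, assuming a hypothetical web service that paginates either by offset or by the last retrieved record key (the endpoint and parameter names are assumptions, not part of the adaptor):

  https://api.example.com/account-records?limit={recordsPerPage}&offset={pageOffset}
  https://api.example.com/account-records?limit={recordsPerPage}&after={lastKeyRecordIDURIencoded}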

Request Data Hooks

The following JSON/XML/CSV data hooks can be configured to output values as defined within the data export field mappings. These hooks can be embedded within any of the JSON/XML/CSV settings of the account enquiry based data exports listed below. They allow the HTTP or file based requests to be modified based on the type of request being made to the Connector. For example, the {requestFilterClause1} hook may be placed within the Data Export URL, and in its field mapping use an IF function to change the URL based on whether a single record needs to be found using one web service endpoint, compared to another web service endpoint that can find multiple records (a sketch follows the field list below).

  • {requestFilterClause1}
  • {requestFilterClause2}
  • {requestFilterClause3}
  • {requestFilterClause4}
  • {requestFilterClause5}

Each of these hooks can reference the following JSON data (as named below) when obtaining records in the original request, as well as use JSON field functions to manipulate the data coming from the data export request.

  • keyCustomerAccountID
    Unique identifier of the customer account that records are to be obtained for.
  • keySupplierAccountID
    Unique identifier of the supplier account that records are to be obtained for.
  • recordType
    Type of records that are being searched on. E.g. INVOICE, ORDER_SALE, BACKORDER, CREDIT, PAYMENT, QUOTE
  • beginDate ¹
    The earliest date to obtain records from. This value is set as a long integer containing the number of milliseconds since the 01/01/1970 12AM UTC epoch, A.K.A. unix time.
  • endDate ¹
    The latest date to obtain records up to. This value is set as a long integer containing the number of milliseconds since the 01/01/1970 12AM UTC epoch, A.K.A. unix time.
  • pageNumber ¹
    The current page number being processed by the data export. The page number starts from number 1.
  • numberOfRecords ¹
    Number of records to retrieve for each page of records requested. This number is the same as set within the data export's assigned JSON/XML/CSV data source type.
  • outstandingRecords ¹
    Either Y or N. If Y then only records that are outstanding or unpaid are returned.
  • searchString ¹
    Text data to search for and match records on. The field(s) to search on are dictated by the value set in the search type, or else a default field.
  • keyRecordIDs ¹
    List of key identifiers of records to match on. The list is comma delimited, with each KeyRecordID value URI encoded.
  • searchType ¹
    Type of search to perform to match records on. For example, it could be set to invoiceNumber to search for invoice records that match a given invoice number.
  • keyRecordID ²
    Unique key identifier of a single record to retrieve.
  • dataSourceSessionID
    ID of the session retrieved by the Data Source Session Create data export. Stores the ID of the session that may be required to be able to access and retrieve data from external systems.

¹ These data fields are only available to data exports that perform searches for the details of a record.
² This data field is only available to data exports that retrieve the details of a single record's lines.
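
For illustration only, assuming a hypothetical web service and an IF-style field function (the exact functions available depend on the adaptor's field function support), a {requestFilterClause1} mapping could switch endpoints depending on whether a single record or a record search is being requested:

  Data Export URL:        records/{requestFilterClause1}
  {requestFilterClause1}: if keyRecordID is set, output "invoice/" followed by keyRecordID,
                          otherwise output "search?text=" followed by searchString

This allows one data export to serve both single record lookups and multi record searches against different endpoints.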

Data Export Definitions

Customer Account Record Back Order Lines

Obtains a list of lines for a single back order assigned to a customer account. The data is placed within an ESDRecordCustomerAccountEnquiryBackOrderLine record when exported. Each of the back order line records is then placed into an ESDRecordCustomerAccountEnquiryBackOrder record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of customer account record being searched for.
{keyRecordID} - Unique ID of the back order record.
{whereClause} - Contains conditions on how the record is obtained.
{keyCustomerAccountID} - Unique ID of the customer account.
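
For illustration only, these hooks could be embedded within the data table data export fields; a minimal SQL sketch, assuming a hypothetical BACKORDER_LINES table (the table and column names are assumptions, not part of the adaptor):

  SELECT LINE_NUMBER, PRODUCT_CODE, QUANTITY, PRICE
  FROM BACKORDER_LINES
  WHERE BACKORDER_ID = '{keyRecordID}' AND ACCOUNT_ID = '{keyCustomerAccountID}'

The same pattern applies to the other record line data exports below that list these hooks.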

Customer Account Record Back Orders

Obtains a list of back orders assigned to a single customer account, with records found based on matching a number of search factors such as date range, or record ID. The data is placed within an ESDRecordCustomerAccountEnquiryBackOrder record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.
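
As a minimal sketch, assuming a hypothetical BACKORDERS table, the search hooks could be combined within a query as follows, with the adaptor replacing each hook with the clause it builds for the current search (the table and column names are assumptions, not part of the adaptor):

  SELECT * FROM BACKORDERS
  WHERE ACCOUNT_ID = '{keyCustomerAccountID}' {whereClause}
  {orderByClause} {limitClause}

The same pattern applies to the other record search data exports below that list these hooks.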

Customer Account Record Credit Lines

Obtains a list of lines for a single credit assigned to a customer account. The data is placed within an ESDRecordCustomerAccountEnquiryCreditLine record when exported. Each of the credit line records is then placed into an ESDRecordCustomerAccountEnquiryCredit record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of customer account record being searched for.
{keyRecordID} - Unique ID of the credit record.
{whereClause} - Contains conditions on how the record is obtained.
{keyCustomerAccountID} - Unique ID of the customer account.

Customer Account Record Credits

Obtains a list of credits assigned to a single customer account, with records found based on matching a number of search factors such as date range, or record ID. The data is placed within an ESDRecordCustomerAccountEnquiryCredit record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.

Customer Account Record Invoice Lines

Obtains a list of lines for a single invoice assigned to a customer account. The data is placed within an ESDRecordCustomerAccountEnquiryInvoiceLine record when exported. Each of the invoice line records is then placed into an ESDRecordCustomerAccountEnquiryInvoice record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of customer account record being searched for.
{keyRecordID} - Unique ID of the invoice record.
{whereClause} - Contains conditions on how the record is obtained.
{keyCustomerAccountID} - Unique ID of the customer account.

Customer Account Record Invoices

Obtains a list of invoices assigned to a single customer account, with records found based on matching a number of search factors such as date range, or record ID. The data is placed within an ESDRecordCustomerAccountEnquiryInvoice record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.

Customer Account Record Payment Lines

Obtains a list of lines for a single payment assigned to a customer account. The data is placed within an ESDRecordCustomerAccountEnquiryPaymentLine record when exported. Each of the payment line records is then placed into an ESDRecordCustomerAccountEnquiryPayment record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of customer account record being searched for.
{keyRecordID} - Unique ID of the payment record.
{whereClause} - Contains conditions on how the record is obtained.
{keyCustomerAccountID} - Unique ID of the customer account.

Customer Account Record Payments

Obtains a list of payments assigned to a single customer account, with records found based on matching a number of search factors such as date range, or record ID. The data is placed within an ESDRecordCustomerAccountEnquiryPayment record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.

Customer Account Record Quote Lines

Obtains a list of lines for a single quote assigned to a customer account. The data is placed within an ESDRecordCustomerAccountEnquiryQuoteLine record when exported. Each of the quote line records is then placed into an ESDRecordCustomerAccountEnquiryQuote record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of customer account record being searched for.
{keyRecordID} - Unique ID of the quote record.
{whereClause} - Contains conditions on how the record is obtained.
{keyCustomerAccountID} - Unique ID of the customer account.

Customer Account Record Quotes

Obtains a list of quotes assigned to a single customer account, with records found based on matching a number of search factors such as date range, or record ID. The data is placed within an ESDRecordCustomerAccountEnquiryQuote record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.

Customer Account Record Sales Order Lines

Obtains a list of lines for a single sales order assigned to a customer account. The data is placed within an ESDRecordCustomerAccountEnquiryOrderSaleLine record when exported. Each of the sales order line records is then placed into an ESDRecordCustomerAccountEnquiryOrderSale record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of customer account record being searched for.
{keyRecordID} - Unique ID of the sales order record.
{whereClause} - Contains conditions on how the record is obtained.
{keyCustomerAccountID} - Unique ID of the customer account.

Customer Account Record Sales Orders

Obtains a list of sales orders assigned to a single customer account, with records found based on matching a number of search factors such as date range, or record ID. The data is placed within an ESDRecordCustomerAccountEnquiryOrderSale record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.

Customer Account Record Transactions

Obtains a list of transactions assigned to a single customer account, with records found based on matching a number of search factors such as date range. The data is placed within an ESDRecordCustomerAccountEnquiryTransaction record when exported. The record data is all placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.
Customer Account Report Invoice Lines

Obtains a list of lines for one or more invoices assigned to a customer account. This can be used to create reports and aggregate invoice line data for showing in Ecommerce systems.
The data is placed within an ESDRecordCustomerAccountEnquiryInvoiceLine record when exported. Each of the invoice line records is then placed into an ESDRecordCustomerAccountEnquiryInvoice record. The record data is placed into an ESDocumentCustomerAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries.

{limitClause} - Limits how many records are returned in the record search.
{keyCustomerAccountID} - Unique ID of the customer account.
{reportID} - ID of the report, matching a configured report within the adaptor.
{recordType} - Type of customer account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.
{orderByField} - Sets the field that records are sorted by.
{orderByClause} - Sets the clause used to order records by.

Customer Account Status

Obtains the status of a single specific customer account. The data is placed within an ESDRecordCustomerAccount record when exported. The record data is all placed into an ESDocumentCustomerAccount document which is exported out in its final state to the Ecommerce system.

It is not recommended to assign this export to a CSV spreadsheet data source in a production environment where the spreadsheet contains a large number of records, due to the long time that it can take to iterate over the records in a CSV file to locate a specific customer account record.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries.

{keyCustomerAccountID} - Unique ID of the customer account.
{checkOnHold} - Has either the value 'Y' or 'N'. If Y then indicates that a check should be made to see if the account is on hold (the customer should not be allowed to purchase more products).
{checkBalance} - Has either the value 'Y' or 'N'. If Y then indicates that a check should be made to see if the account's balance is within its limits.
{checkTerms} - Has either the value 'Y' or 'N'. If Y then indicates that a check should be made to see if the account has paid for all purchases within its payment terms.

JSON/XML/CSV Data Hooks

Any of the following data hooks may be embedded within the Data Export URL, which will get replaced with values allowing record searches to be filtered when requesting data.

{keyCustomerAccountID} - Unique ID of the customer account, with its value URL encoded.
{keyCustomerAccountIDRaw} - Unique ID of the customer account, with its value not encoded.
{checkOnHold} - Has either the value 'Y' or 'N'. If Y then indicates that a check should be made to see if the account is on hold (the customer should not be allowed to purchase more products).
{checkBalance} - Has either the value 'Y' or 'N'. If Y then indicates that a check should be made to see if the account's balance is within its limits.
{checkTerms} - Has either the value 'Y' or 'N'. If Y then indicates that a check should be made to see if the account has paid for all purchases within its payment terms.
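
As a purely illustrative sketch, a Data Export URL for this export could embed the hooks as follows (the endpoint and parameter names are assumptions, not part of the adaptor):

  account-status?account={keyCustomerAccountID}&onHold={checkOnHold}&balance={checkBalance}&terms={checkTerms}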

Supplier Account Record Purchase Orders

Obtains a list of purchase orders assigned to a single supplier account. The data is placed within an ESDRecordSupplierAccountEnquiryPurchaseOrder record when exported. The record data is all placed into an ESDocumentSupplierAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries.

{whereClause} - Contains a number of conditions to control how record searches occur, matching record dates based on unix timestamp long integer representations of dates.
{whereClauseDateStringMatching} - Contains a number of conditions to control how record searches occur, including matching records by date based on date formatted string representations of dates.
{orderByClause} - Controls how records are ordered.
{limitClause} - Limits how many records are returned in the record search.
{keySupplierAccountID} - Unique ID of the supplier account.
{recordType} - Type of supplier account record being searched for.
{pageNumber} - The page number to obtain records for.
{numberOfRecords} - The number of records to return per page.

Supplier Account Record Purchase Order Lines

Obtains a list of lines for a single purchase order assigned to a supplier account. The data is placed within an ESDRecordSupplierAccountEnquiryOrderPurchaseLine record when exported. Each of the purchase order line records is then placed into an ESDRecordSupplierAccountEnquiryOrderPurchase record. The record data is placed into an ESDocumentSupplierAccountEnquiry document which is exported out in its final state to the Ecommerce system.

ODBC/SQL Data Hooks

Any of the following data hooks may be embedded within the data table data export fields, which will get replaced with values allowing record searches to be set up within SQL queries, based on the connecting Ecommerce system's needs.

{recordType} - Type of supplier account record being searched for.
{keyRecordID} - Unique ID of the purchase order record.
{whereClause} - Contains conditions on how the record is obtained.
{keySupplierAccountID} - Unique ID of the supplier account.

Data Imports

The Generic adaptor has the data imports listed in the table below, which can be configured to write data to chosen ODBC database table(s), MS SQL Server database table(s), CSV spreadsheet file(s), an ESD supported web service, or JSON/XML/CSV based files or web services. The data is first converted from an Ecommerce Standards Document that has been stored or sent through to the Connector, and then imported from the adaptor into the configured data source type.

Spreadsheet CSV Data Imports

For CSV spreadsheet data sources each data import needs to be configured to map each Ecommerce Standards Document record field to the name of a column that will be placed within the generated CSV file. Some data imports will have their data joined together when the data is converted from the standards objects, and written to a CSV file as separate rows. Each data import can be configured to write to one or more CSV files. By placing a different file name in each data import's "CSV Text File Name Matcher" field, multiple files can be written for a single type of record, such as a sales order, purchase order or customer account payment. The imports can be configured to write CSV data with any file extension, not just .csv files. The data source also supports writing files that use different characters to delimit data, such as tabs or spaces. There are data source settings where you can define the characters that delimit field data, as well as rows. Additionally there are settings that allow you to specify the characters that escape a character which would normally be used to delimit data or rows.
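
For example, a Customer Account Payments data import configured with a comma field delimiter might generate a file like the following sketch, where the column names are choices made in the import's field mappings (all names and values here are hypothetical):

  paymentCode,customerAccountCode,paymentAmount
  PAY-0001,ACC-001,150.50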

SQL Database Based Data Imports

For ODBC, Microsoft SQL Server, and SQLite database data sources each data import needs to be configured to write data to a specified database table, and map the fields in the tables/views from the Ecommerce Standards Document record fields. For the sales orders, purchase orders, and customer account payment data imports, because they use multiple data imports to write all of the record's data, the data can be written to different database tables. In the Data Import's "Data Table Settings" tab the "Get New Record ID Query" setting allows you to specify a SQL query that will be called after data is written to a database table, to obtain the ID of the new row created in the database. This value is then made available to the other associated data imports to import data with. This is useful when needing to link one database table record to another, such as when the lines for a sales order are stored in a different database table to the sales order itself, and the lines need to store the unique identifier of the sales order.
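
A minimal sketch of a "Get New Record ID Query", assuming a hypothetical SALES_ORDERS table whose ORDER_ID column is populated by the database when a row is inserted:

  SELECT MAX(ORDER_ID) FROM SALES_ORDERS

On Microsoft SQL Server, a table using an identity column could instead use the built-in IDENT_CURRENT function to obtain the last identity value generated for the table:

  SELECT IDENT_CURRENT('SALES_ORDERS')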

Ecommerce Standards Documents Webservice Data Imports

For the ESD webservice data sources the imports are configured to send a HTTP POST request to a compatible webservice, containing the data in the native ESD serialised JSON format. The ESD web service is then expected to return an ESD document in the HTTP response body confirming that the data was successfully imported, along with a HTTP 200-OK response code. Some of the data imports listed below are not required to be configured for an ESD data source, since those data imports' settings are not required to post the relevant data.

JSON/XML/CSV - Webservice/File Data Imports

For the JSON/XML/CSV webservice or file based data imports, a data source first needs to be set up so that the adaptor knows whether to make a HTTP request to send the JSON or XML data through a computer network or internet location, or otherwise write to a file stored on the file system. This is done by setting the Protocol of the data source to either "http://", "https://", or "file:///". For HTTP connections the domain and header information will need to be set to allow connections to be made to the webservice. For each data import you can then set the relative URL path to call the most appropriate endpoint that will be used to import data. When setting up the data source you will need to specify whether XML or JSON data should be sent to the web service or local file. You may also need to set up the Data Source Session Create and Data Source Session Destroy data exports if the JSON or XML data is being sent to a web service that requires a session, login or token to be created before data can be imported. See the Using Sessions When Calling JSON/XML/CSV Web Services section on this page for more details.

In the Data Import's "JSON/XML/CSV Settings" tab the "Data Import URL" setting allows you to set the name of the web service endpoint, or the file name, where the data is to be imported to. This URL will be appended to the Protocol and Host/IP Address settings set in the data source settings. Once a final URL or file path is built by the adaptor, it will then be used to send the JSON or XML data to. The Data Import URL only needs to be set for the Sales Orders, Purchase Orders and Customer Account Payments data imports. This setting will be ignored for all other data imports since they are associated with these afore-mentioned primary data imports. In the "Data Import URL", if the data source's Protocol setting is set to "file:///" then a data hook labelled {#} can be set, which will be replaced by the unique ID of the record data saved to the Connector. This can be used to ensure that files are not overwritten when the Connector imports data. For example in the Sales Orders data import, if the Data Import URL is set to "sales_order_{#}.json" then sales order files will be saved to the file system as sales_order_3.json, sales_order_4.json, sales_order_6.json etc.

In the Data Import's "JSON/XML/CSV Settings" tab the "Data Import Content" setting is where the layout and structure of the JSON data needs to be specified. Any JSON data can be placed within this setting. Any data fields from the record data being imported can have data hooks embedded within the Data Import Content setting. Each of the data hooks is named in the format {ESDDataField}, where the name of the ESD data field, as listed in the Data Fields tab, is set within the curly brackets. For example in the Sales Orders data import the data hook {billingAddress1} can be used to place the 1st billing address field into the Data Import Content JSON. Any data hooks that store text values will automatically have their values escaped, ensuring that if the field data contains double quote characters, new lines, carriage returns or tab characters, these characters will be correctly escaped. Within the Data Fields tab, in the Custom Data Field Definition column, you need to specify the name of the data field that will be outputted for the data hook. In the Data Field Definition you also have the ability to place functions that allow traversal through the JSON data structure to obtain the required data, as well as allow the data to be manipulated before being placed into its data hook. If a data field is not set as Active then its data hook will output an empty value. It is recommended to not activate any data fields that don't have data hooks embedded within the Data Import Content setting, to reduce the time taken to generate the JSON document.
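
As a sketch only, the Data Import Content for a Sales Orders data import could contain JSON such as the following, where each embedded hook names a data field activated in the Data Fields tab ({billingAddress1} appears in the example above; the other hook names and the surrounding JSON property names are hypothetical choices, not fixed by the adaptor):

  {
    "orderCode": "{salesOrderCode}",
    "customerAccountCode": "{customerAccountCode}",
    "billing": {
      "address1": "{billingAddress1}"
    }
  }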

If XML data is being sent to the web service or written to a local file then the data will first be created as JSON data. After the JSON data has been successfully created from all associated data imports, the adaptor will parse and convert the JSON data into XML data, using Newtonsoft's JSON to XML open source converter library. The JSON data must contain the XML version declaration and root elements for it to be successfully converted into an XML document. If the JSON data is not being converted into XML before being imported then it will not be parsed, and it is up to you to set syntactically correct JSON.
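
A minimal sketch of JSON containing the XML declaration and a single root element, in the form that Newtonsoft's converter understands (the root and child element names, and the {salesOrderCode} hook, are assumptions):

  {
    "?xml": { "@version": "1.0", "@encoding": "utf-8" },
    "SalesOrder": {
      "OrderCode": "{salesOrderCode}"
    }
  }

When converted, this would produce an XML document with an <?xml?> declaration and a <SalesOrder> root element.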

Data Import Definitions

Customer Account Payments

Imports a payment applied against a customer account. The payment data is obtained from an ESDRecordCustomerAccountPayment record when imported. The data is joined with the Customer Account Payment Records data import to allow both the customer account payment to be imported, as well as a list of records that the payment is applied against (such as invoice or sales order records).

This data import will be called before the Customer Account Payment Records data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where the Customer Account Payment Records data import will be called first to obtain the payment record data, allowing it to be embedded.

For the JSON/XML/CSV - Web Service/File data source type, if its protocol is set to HTTP or HTTPS then the adaptor will expect the web service receiving the data to return a HTTP response code of 200-OK, 201-Created, or 202-Accepted. If the Data Import Result Matcher setting has a regular expression set then it will be used to match against the body of the HTTP response. If the regular expression matches, or is left empty, then the payment will be marked as successfully imported.

JSON/XML/CSV - Web Service/File Data Hooks

The following data hook may be embedded within the Data Import Content field, which will get replaced with its value.

{paymentRecords} - Contains the collated set of JSON records from each of the Customer Account Payment Records data imports called for each of the payment records assigned to the payment. Each payment record is separated by a comma character. The hook will be empty if the Customer Account Payment Records data import is not assigned to the same data source as this data import.
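
As a sketch only, the {paymentRecords} hook could be embedded within the payment's Data Import Content as the body of a JSON array (all property names and hook names here other than {paymentRecords} are hypothetical):

  {
    "paymentCode": "{paymentCode}",
    "records": [ {paymentRecords} ]
  }

Each Customer Account Payment Records data import would then contribute one JSON object to the array, separated by commas.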

Customer Account Payment Records

Imports a record applied against a customer account payment, such as a paid amount against an invoice or sales order record. The payment record data is obtained from an ESDRecordCustomerAccountPaymentRecord record when imported. The data is joined with the Customer Account Payments data import to allow both the customer account payment to be imported, as well as a list of records that the payment is applied against.

This data import will only be called if records are assigned to a Customer Account Payment being imported. This data import will be called for each record assigned to a payment after the Customer Account Payments data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Customer Account Payment Records data import is called first to obtain the payment record data, allowing it to be embedded.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Customer Account Payments data import otherwise it won't be called.

Customer Account Payment Surcharges

Imports a surcharge applied against a customer account payment, such as a credit card surcharge fee. The surcharge data is obtained from an ESDRecordCustomerAccountPaymentSurcharge record when imported. The data is joined with the Customer Account Payments data import to allow both the customer account payment to be imported, as well as a list of surcharges that have been included in the payment.

This data import will only be called if surcharges are assigned to a Customer Account Payment being imported. This data import will be called for each surcharge assigned to a payment after the Customer Account Payments data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Customer Account Payment Surcharges data import is called first to obtain the payment surcharge data, allowing it to be embedded.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Customer Account Payments data import otherwise it won't be called.

Customer Account Payment Surcharge Taxes

Imports a tax applied to a surcharge, for a surcharge that has been applied against a customer account payment, such as if a Goods and Services tax is applied to a credit card surcharge payment fee. The surcharge tax data is obtained from an ESDRecordCustomerAccountPaymentSurchargeTax record when imported. The data is joined with the Customer Account Payment Surcharges data import to allow both the customer account payment surcharge to be imported, as well as a list of surcharge taxes that have been included in the payment.

This data import will only be called if surcharges are assigned to a Customer Account Payment being imported, and a surcharge has a tax applied to it. This data import will be called for each surcharge tax assigned to a payment surcharge after the Customer Account Payments and Customer Account Payment Surcharges data imports for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Customer Account Payment Surcharge Taxes data import is called first to obtain the payment surcharge tax data, allowing it to be embedded.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Customer Account Payments data import otherwise it won't be called.

Purchase Orders

Imports a purchase order. The purchase order data is obtained from an ESDRecordOrderPurchase record when imported. The data is joined with the data generated from the following data imports:

  • Purchase Order Lines
  • Purchase Order Line Taxes
  • Purchase Order Line Attribute Profiles
  • Purchase Order Line Attributes
  • Purchase Order Product Deliveries
  • Purchase Order Payments
  • Purchase Order Surcharges
  • Purchase Order Surcharge Taxes.

The Purchase Orders data import will be called before all the other purchase order data imports for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where each of the dependent data imports will be called first, before the Purchase Orders data import is called last.

For the JSON/XML/CSV - Web Service/File data source type, if its protocol is set to HTTP or HTTPS then the adaptor will expect the web service receiving the data to return an HTTP response code of 200-OK, 201-Created, or 202-Accepted. If the Data Import Result Matcher setting has a regular expression set then it will be used to match against the body of the HTTP response. If the regular expression matches, or is left empty, then the order will be marked as successfully imported.

JSON/XML/CSV - Web Service/File Data Hooks

Any of the following data hooks may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{lines} - contains the collated set of JSON records from each of the Purchase Order Lines data imports called for each of the line records assigned to the purchase order. Each line record outputted is separated by a comma character. The hook will be empty if the Purchase Order Lines data import is not assigned to the same data source as this data import.

{surcharges} - contains the collated set of JSON records from each of the Purchase Order Surcharges data imports called for each of the surcharge records assigned to the purchase order. Each surcharge record outputted is separated by a comma character. The hook will be empty if the Purchase Order Surcharges data import is not assigned to the same data source as this data import.

{payments} - contains the collated set of JSON records from each of the Purchase Order Payments data imports called for each of the payments assigned to the purchase order. Each payment record outputted is separated by a comma character. The hook will be empty if the Purchase Order Payments data import is not assigned to the same data source as this data import.
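
As an illustrative sketch only, a Data Import Content setting for the Purchase Orders data import could embed all three hooks inside JSON arrays, so that the comma-separated records collate into valid JSON (the property and field hook names here are assumptions):

{
    "purchaseOrder": {
        "supplierAccountCode": "{supplierAccountCode}",
        "lines": [{lines}],
        "surcharges": [{surcharges}],
        "payments": [{payments}]
    }
}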

Purchase Order Lines

Imports a line set against a purchase order. The line could represent a quantity of a product, download, or labour being purchased. The line record data is obtained from an ESDRecordOrderPurchaseLine record when imported. The data is joined with the following data imports:

  • Purchase Orders
  • Purchase Order Line Taxes
  • Purchase Order Product Deliveries
  • Purchase Order Line Attribute Profiles

This data import will only be called if there are lines assigned to a Purchase Order being imported. This data import will be called for each line record assigned to the purchase order after the Purchase Orders data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Purchase Order Lines data import is called before the Purchase Orders data import. The Purchase Order Line Taxes, Purchase Order Line Attribute Profiles, and Purchase Order Product Deliveries data imports will all be called before the Purchase Order Lines data import for the JSON/XML/CSV data source type.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Orders data import otherwise it won't be called.

JSON/XML/CSV - Web Service/File Data Hooks

Any of the following data hooks may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{taxes} - contains the collated set of JSON records from each of the Purchase Order Line Taxes data imports called for each of the line tax records assigned to a purchase order line. Each line tax record outputted is separated by a comma character. The hook will be empty if the Purchase Order Line Taxes data import is not assigned to the same data source as this data import.

{productDeliveries} - contains the collated set of JSON records from each of the Purchase Order Product Deliveries data imports called for each of the product delivery records assigned to the purchase order line. Each product delivery record outputted is separated by a comma character. The hook will be empty if the Purchase Order Product Deliveries data import is not assigned to the same data source as this data import.

{attributes} - contains the collated set of JSON records from each of the Purchase Order Line Attribute Profiles data imports called for each of the attribute profiles assigned to the purchase order line. Each attribute profile record outputted is separated by a comma character. The hook will be empty if the Purchase Order Line Attribute Profiles data import is not assigned to the same data source as this data import.
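
Continuing the illustrative sketch, a Data Import Content setting for the Purchase Order Lines data import might embed its hooks as follows (the field hook and property names are assumptions):

{
    "productCode": "{productCode}",
    "quantity": "{quantity}",
    "taxes": [{taxes}],
    "productDeliveries": [{productDeliveries}],
    "attributes": [{attributes}]
}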

Purchase Order Text Lines

Imports a text line set against a purchase order. The text line contains text providing details or instructions about the order. The text line data is obtained from an ESDRecordOrderPurchaseLine record when imported. The data is joined with the Purchase Orders data import.

This data import will only be called if there are text lines assigned to a Purchase Order being imported. This data import will be called for each text line record assigned to the purchase order after the Purchase Orders data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where it will call the Purchase Order Text Lines data import before the Purchase Orders data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Orders data import otherwise it won't be called.

Purchase Order Line Taxes

Imports a tax applied against a purchase order line. The line tax defines the kind of tax and monetary amount applied against the line. The line tax record data is obtained from an ESDRecordOrderLineTax record when imported. The data is joined with the Purchase Order Lines data import to allow both the taxes applied against the purchase order line, and the purchase order line, to be imported.

This data import will only be called if there are taxes assigned to a Purchase Order Line being imported. This data import will be called for each tax record assigned to the purchase order line after the Purchase Order Lines data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Purchase Order Line Taxes data import is called before the Purchase Order Lines data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Order Lines data import otherwise it won't be called.
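
The Data Import Content for this import would typically be a single JSON object fragment, since it is each generated fragment that gets collated, comma separated, into the parent line's {taxes} hook. A hedged sketch with assumed field hook names:

{
    "taxcode": "{taxcode}",
    "taxAmount": "{taxAmount}"
}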

Purchase Order Line Attribute Profiles

Imports an attribute profile assigned against a purchase order line. The attribute profile defines how attributes are grouped against the line. The attribute profile record data is obtained from an ESDRecordOrderLineAttributeProfile record when imported. The data is joined with the Purchase Order Lines data import to allow both the attributes assigned against the purchase order line, and the purchase order line, to be imported.

This data import will only be called if there are attribute profiles assigned to a Purchase Order Line being imported. This data import will be called for each attribute profile record assigned to the purchase order line after the Purchase Order Lines data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Purchase Order Line Attribute Profiles data import is called before the Purchase Order Lines data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Order Lines data import otherwise it won't be called.

JSON/XML/CSV - Web Service/File Data Hooks

The following data hook may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{values} - contains the collated set of JSON records from each of the Purchase Order Line Attributes data imports called for each of the line attribute records assigned to a purchase order line. Each attribute record outputted is separated by a comma character. The hook will be empty if the Purchase Order Line Attributes data import is not assigned to the same data source as this data import.

Purchase Order Line Attributes

Imports an attribute assigned against a purchase order line's profile. The attribute defines a field and value assigned against an order line, such as a comment, or other data. The attribute record data is obtained from an ESDRecordOrderLineAttribute record when imported. The data is joined with the Purchase Order Line Attribute Profiles data import to allow both the attributes assigned against the purchase order line, and the purchase order line attribute profile to be imported.

This data import will only be called if there are attributes assigned to a Purchase Order Line being imported. This data import will be called for each attribute record assigned to a purchase order line's attribute profile after the Purchase Order Line Attribute Profiles data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Purchase Order Line Attributes data import is called before the Purchase Order Line Attribute Profiles data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Order Line Attribute Profiles data import otherwise it won't be called.

Purchase Order Product Deliveries

Imports a product delivery associated to a purchase order line. The product delivery indicates the quantity of ordered product that has been delivered from the supplier. The product delivery record data is obtained from an ESDRecordOrderProductDelivery record when imported. The data is joined with the Purchase Order Lines data import to allow the product deliveries recorded against the purchase order line to be imported.

This data import will only be called if there are product deliveries assigned to a Purchase Order Line being imported. This data import will be called for each delivery record assigned to the purchase order line after the Purchase Order Lines data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Purchase Order Product Deliveries data import is called before the Purchase Order Lines data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Order Lines data import otherwise it won't be called.

Purchase Order Payments

Imports a payment assigned against a purchase order. Each payment records the monetary amount that has been paid against the purchase order. The payment record data is obtained from an ESDRecordOrderPayment record when imported. The data is joined with the Purchase Orders data import to allow the payments recorded against the purchase order to be imported.

This data import will only be called if there are payments assigned to a Purchase Order being imported. This data import will be called for each payment record assigned to the purchase order after the Purchase Orders data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where it will call the Purchase Order Payments data import before the Purchase Orders data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Orders data import otherwise it won't be called.

Purchase Order Surcharges

Imports a surcharge applied against a purchase order. Each surcharge indicates the monetary fee being charged to the order, such as for freight, or minimum order surcharges. The surcharge record data is obtained from an ESDRecordOrderSurcharge record when imported. The data is joined with the Purchase Orders data import to allow the surcharges set against the purchase order to be imported.

This data import will only be called if there are surcharges assigned to a Purchase Order being imported. This data import will be called for each surcharge record assigned to the purchase order after the Purchase Orders data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where it will call the Purchase Order Surcharges data import before the Purchase Orders data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Orders data import otherwise it won't be called.

JSON/XML/CSV - Web Service/File Data Hooks

The following data hook may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{taxes} - contains the collated set of JSON records from each of the Purchase Order Surcharge Taxes data imports called for each of the surcharge tax records assigned to a purchase order surcharge. Each surcharge tax record outputted is separated by a comma character. The hook will be empty if the Purchase Order Surcharge Taxes data import is not assigned to the same data source as this data import.

Purchase Order Surcharge Taxes

Imports a tax applied against a purchase order surcharge. The surcharge tax defines the kind of tax and monetary amount applied against the surcharge. The surcharge tax record data is obtained from an ESDRecordOrderLineTax record when imported. The data is joined with the Purchase Order Surcharges data import to allow the taxes applied against the purchase order surcharge to be imported.

This data import will only be called if there are taxes assigned to a Purchase Order Surcharge being imported. This data import will be called for each tax record assigned to the purchase order surcharge after the Purchase Order Surcharges data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Purchase Order Surcharge Taxes data import is called before the Purchase Order Surcharges data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Purchase Order Surcharges data import otherwise it won't be called.

Sales Orders

Imports a sales order. The sales order data is obtained from an ESDRecordOrderSale record when imported. The data is joined with the data generated from the following data imports:

  • Sales Order Lines
  • Sales Order Line Taxes
  • Sales Order Line Attribute Profiles
  • Sales Order Line Attributes
  • Sales Order Product Deliveries
  • Sales Order Payments
  • Sales Order Surcharges
  • Sales Order Surcharge Taxes.

The Sales Orders data import will be called before all the other sales order data imports for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where each of the dependent data imports will be called first, before the Sales Orders data import is called last.

For the JSON/XML/CSV - Web Service/File data source type, if its protocol is set to HTTP or HTTPS then the adaptor will expect the web service receiving the data to return an HTTP response code of 200-OK, 201-Created, or 202-Accepted. If the Data Import Result Matcher setting has a regular expression set then it will be used to match against the body of the HTTP response. If the regular expression matches, or is left empty, then the order will be marked as successfully imported.

JSON/XML/CSV - Web Service/File Data Hooks

Any of the following data hooks may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{lines} - contains the collated set of JSON records from each of the Sales Order Lines data imports called for each of the line records assigned to the sales order. Each line record outputted is separated by a comma character. The hook will be empty if the Sales Order Lines data import is not assigned to the same data source as this data import.

{surcharges} - contains the collated set of JSON records from each of the Sales Order Surcharges data imports called for each of the surcharge records assigned to the sales order. Each surcharge record outputted is separated by a comma character. The hook will be empty if the Sales Order Surcharges data import is not assigned to the same data source as this data import.

{payments} - contains the collated set of JSON records from each of the Sales Order Payments data imports called for each of the payments assigned to the sales order. Each payment record outputted is separated by a comma character. The hook will be empty if the Sales Order Payments data import is not assigned to the same data source as this data import.

Sales Order Lines

Imports a line set against a sales order. The line could represent a quantity of a product, download, or labour sold to a customer. The line record data is obtained from an ESDRecordOrderSaleLine record when imported. The data is joined with the following data imports:

  • Sales Orders
  • Sales Order Line Taxes
  • Sales Order Product Deliveries
  • Sales Order Line Attribute Profiles

This data import will only be called if there are lines assigned to a Sales Order being imported. This data import will be called for each line record assigned to the sales order after the Sales Orders data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Sales Order Lines data import is called before the Sales Orders data import. The Sales Order Line Taxes, Sales Order Line Attribute Profiles, and Sales Order Product Deliveries data imports will all be called before the Sales Order Lines data import for the JSON/XML/CSV data source type.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Orders data import otherwise it won't be called.

JSON/XML/CSV - Web Service/File Data Hooks

Any of the following data hooks may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{taxes} - contains the collated set of JSON records from each of the Sales Order Line Taxes data imports called for each of the line tax records assigned to a sales order line. Each line tax record outputted is separated by a comma character. The hook will be empty if the Sales Order Line Taxes data import is not assigned to the same data source as this data import.

{productDeliveries} - contains the collated set of JSON records from each of the Sales Order Product Deliveries data imports called for each of the product delivery records assigned to the sales order line. Each product delivery record outputted is separated by a comma character. The hook will be empty if the Sales Order Product Deliveries data import is not assigned to the same data source as this data import.

{attributes} - contains the collated set of JSON records from each of the Sales Order Line Attribute Profiles data imports called for each of the attribute profiles assigned to the sales order line. Each attribute profile record outputted is separated by a comma character. The hook will be empty if the Sales Order Line Attribute Profiles data import is not assigned to the same data source as this data import.

Sales Order Text Lines

Imports a text line set against a sales order. The text line contains text providing details or instructions about the order. The text line data is obtained from an ESDRecordOrderSaleLine record when imported. The data is joined with the Sales Orders data import.

This data import will only be called if there are text lines assigned to a Sales Order being imported. This data import will be called for each text line record assigned to the sales order after the Sales Orders data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where it will call the Sales Order Text Lines data import before the Sales Orders data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Orders data import otherwise it won't be called.

Sales Order Line Taxes

Imports a tax applied against a sales order line. The line tax defines the kind of tax and monetary amount applied against the line. The line tax record data is obtained from an ESDRecordOrderLineTax record when imported. The data is joined with the Sales Order Lines data import to allow both the taxes applied against the sales order line, and the sales order line, to be imported.

This data import will only be called if there are taxes assigned to a Sales Order Line being imported. This data import will be called for each tax record assigned to the sales order line after the Sales Order Lines data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Sales Order Line Taxes data import is called before the Sales Order Lines data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Order Lines data import otherwise it won't be called.

Sales Order Line Attribute Profiles

Imports an attribute profile assigned against a sales order line. The attribute profile defines how attributes are grouped against the line. The attribute profile record data is obtained from an ESDRecordOrderLineAttributeProfile record when imported. The data is joined with the Sales Order Lines data import to allow both the attributes assigned against the sales order line, and the sales order line, to be imported.

This data import will only be called if there are attribute profiles assigned to a Sales Order Line being imported. This data import will be called for each attribute profile record assigned to the sales order line after the Sales Order Lines data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Sales Order Line Attribute Profiles data import is called before the Sales Order Lines data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Order Lines data import otherwise it won't be called.

JSON/XML/CSV - Web Service/File Data Hooks

The following data hook may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{values} - contains the collated set of JSON records from each of the Sales Order Line Attributes data imports called for each of the line attribute records assigned to a sales order line. Each attribute record outputted is separated by a comma character. The hook will be empty if the Sales Order Line Attributes data import is not assigned to the same data source as this data import.

Sales Order Line Attributes

Imports an attribute assigned against a sales order line's profile. The attribute defines a field and value assigned against an order line, such as a comment, or other data. The attribute record data is obtained from an ESDRecordOrderLineAttribute record when imported. The data is joined with the Sales Order Line Attribute Profiles data import to allow both the attributes assigned against the sales order line, and the sales order line attribute profile to be imported.

This data import will only be called if there are attributes assigned to a Sales Order Line being imported. This data import will be called for each attribute record assigned to a sales order line's attribute profile after the Sales Order Line Attribute Profiles data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Sales Order Line Attributes data import is called before the Sales Order Line Attribute Profiles data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Order Line Attribute Profiles data import otherwise it won't be called.

Sales Order Product Deliveries

Imports a product delivery associated to a sales order line. The product delivery indicates the quantity of ordered product that has been delivered to the customer. The product delivery record data is obtained from an ESDRecordOrderProductDelivery record when imported. The data is joined with the Sales Order Lines data import to allow the product deliveries recorded against the sales order line to be imported.

This data import will only be called if there are product deliveries assigned to a Sales Order Line being imported. This data import will be called for each delivery record assigned to the sales order line after the Sales Order Lines data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Sales Order Product Deliveries data import is called before the Sales Order Lines data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Order Lines data import otherwise it won't be called.

Sales Order Payments

Imports a payment assigned against a sales order. Each payment records the monetary amount that has been paid against the sales order. The payment record data is obtained from an ESDRecordOrderPayment record when imported. The data is joined with the Sales Orders data import to allow the payments recorded against the sales order to be imported.

This data import will only be called if there are payments assigned to a Sales Order being imported. This data import will be called for each payment record assigned to the sales order after the Sales Orders data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where it will call the Sales Order Payments data import before the Sales Orders data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Orders data import otherwise it won't be called.

Sales Order Surcharges

Imports a surcharge applied against a sales order. Each surcharge indicates the monetary fee being charged to the order, such as for freight, or minimum order surcharges. The surcharge record data is obtained from an ESDRecordOrderSurcharge record when imported. The data is joined with the Sales Orders data import to allow the surcharges set against the sales order to be imported.

This data import will only be called if there are surcharges assigned to a Sales Order being imported. This data import will be called for each surcharge record assigned to the sales order after the Sales Orders data import for all data source types except for the JSON/XML/CSV - Web Service/File data source type, where it will call the Sales Order Surcharges data import before the Sales Orders data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Orders data import otherwise it won't be called.

JSON/XML/CSV - Web Service/File Data Hooks

The following data hook may be embedded within the Data Import Content field, which will get replaced with values at the time the data import is called.

{taxes} - contains the collated set of JSON records from each of the Sales Order Surcharge Taxes data imports called for each of the surcharge tax records assigned to a sales order surcharge. Each surcharge tax record outputted is separated by a comma character. The hook will be empty if the Sales Order Surcharge Taxes data import is not assigned to the same data source as this data import.

Sales Order Surcharge Taxes

Imports a tax applied against a sales order surcharge. The surcharge tax defines the kind of tax and monetary amount applied against the surcharge. The surcharge tax record data is obtained from an ESDRecordOrderLineTax record when imported. The data is joined with the Sales Order Surcharges data import to allow the taxes applied against the sales order surcharge to be imported.

This data import will only be called if there are taxes assigned to a Sales Order Surcharge being imported. This data import will be called for each tax record assigned to the sales order surcharge after the Sales Order Surcharges data import for all data source types, except for the JSON/XML/CSV - Web Service/File data source type, where the Sales Order Surcharge Taxes data import is called before the Sales Order Surcharges data import.

For the JSON/XML/CSV - Web Service/File data source type it must be assigned to the same data source as the Sales Order Surcharges data import otherwise it won't be called.

Using Sessions When Calling JSON/XML/CSV Web Services

The Generic adaptor supports the ability to login/create sessions before requesting to read or write data for all data exports/imports when using the JSON/XML/CSV - Web Service/File data source type. When this data source type is assigned to a data export and the Data Source Session Create data export is configured, then any data export request will first call the Data Source Session Create data export to attempt to login to the web service, create a session, and have the ID of the session returned to the adaptor.

Data Exports With Generic Adaptor Session Authentication

{dataSourceSessionID} Data Hook

If a session is successfully created when the Data Source Session Create data export is called, then the data hook {dataSourceSessionID} can be placed into the Data Export URL and HTTP Request Headers fields of any other data exports assigned to a JSON/XML/CSV - Web Service/File data source type.
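
For example (the host name, path, and header scheme below are hypothetical), a data export assigned to such a data source could embed the hook like so:

Data Export URL:       https://erp.example.com/api/products?sessionID={dataSourceSessionID}
HTTP Request Headers:  Authorization: SESSION {dataSourceSessionID}

When the data export runs, the adaptor replaces {dataSourceSessionID} with the session ID returned by the Data Source Session Create data export before making the request.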

Destroying Sessions

Once data has been obtained, if the Data Source Session Destroy data export has been configured then the Generic adaptor will call it to attempt to log out of the web service and have its session destroyed. This ensures that sessions do not live longer than necessary and reduces the risk of 3rd parties being able to access the session and get access to sensitive data. If the Data Source Session Create data export is configured then it is critically important that the Data Source Session Destroy data export is also configured, to ensure that sessions are actively managed.

 

Create A Generic Adaptor Within The Connector

To create a new Generic adaptor within the Connector application, follow these steps:

  1. Open the Connector application
  2. Within the Adaptors and Messages tab, under the Adaptors section, in the adaptors drop down select the Generic Adaptor Version 1 option.
  3. In the Name and Key textbox type in a name to label the Generic adaptor. Give it a name that indicates the kind of data or data source that the Generic adaptor pushes and pulls data from.
  4. Click on the Add button.

The Generic adaptor will be added to the table of adaptors set up in the Connector. Next configure the general settings of the adaptor.

  1. Within the Adaptors and Messages tab of the Connector application click on the Adaptor button of the Generic adaptor table row previously set.
  2. Within the Adaptor Settings window set each of the adaptor's settings. Read the Adaptor Setup page on how to configure each of the settings.

Set Up a Data Source Within the Generic Adaptor

To add a data source to an existing Generic adaptor within the Connector follow these steps:

  1. Within the Adaptors and Messages tab of the Connector application click on the Settings button of the Generic adaptor that you wish to add a data source to.
  2. In the Generic Adaptor Settings window, under the General tab, in the Data Source Types drop down choose the kind of data source that you wish to create. You can choose from one of the following options:
    • Microsoft SQL Server
      Allows data to be read or written to a Microsoft SQL Server database
       
    • ODBC
      Allows data to be read or written to an ODBC compatible database
       
    • HTTP - Ecommerce Standards Document
      Allows data to be read or written to a Web Service that supports the Ecommerce Standards Document data format, using the HTTP/S protocol. 
       
    • CSV Spreadsheet File
      Allows data to be read or written to one or more CSV spreadsheet files, based on being formatted in the same consistent way.
       
    • JSON/XML/CSV Web Service/File
      Allows data to be read or written to a JSON, XML or CSV file, or to a web service that serves JSON, XML or CSV data, using the HTTP/S protocol.
  3. In the Data Source Label textbox give the data source a meaningful name, such as the name of the system or files that the data source connects to.
  4. Click the Add Data Source button.

The data source will be added to the Data Sources table. Once the data source has been created click on the Modify button to configure the settings for the data source. Once its settings have been configured, restart the Connector Host Service to have the new data source loaded in the Connector Host. After this you can then start assigning the data source to each of the different Data Exports and Data Imports of the Generic adaptor, allowing different kinds of data to be read from or written to the data source.

Exporting/Uploading Product Images and Attachments To an Ecommerce Platform

Once a Generic adaptor is set up within the Connector there is the ability for it to upload both product image files and product attachment files to a chosen Ecommerce platform. This can automate the process of uploading these product files, allowing large amounts of files to be easily uploaded on a continual basis.

Exporting/Uploading Product Image Files

To allow the Generic adaptor to upload product image files a data source has to be first set up that will be used to obtain a list of image files to upload. The list needs to be stored in one of the Generic adaptor's supported data source formats, such as a CSV spreadsheet file, an ODBC/MS SQL compatible database table, an XML file, a JSON file, or being retrieved from another webservice.

Once a list of product image data has been set up then within the Generic adaptor the "Product Images" data export needs to be assigned to the data source and configured. The Product Images export requires that the following field data exists for each image record set up in the list of records:

filePath (Mandatory)

File path to locate the product image on the computer that the Connector is running on. The file path can be absolute (such as C:\data\products\images\example.jpg) or relative (such as images\example.jpg).

If the file path is relative then the adaptor will prefix the value of the Product Image Directory setting in front of the filePath field. For example if the Product Image Directory setting was set to "C:\data\products", the adaptor would place that in front of the relative filePath images\example.jpg, making the absolute file path C:\data\products\images\example.jpg.

Be careful if setting network drives in file paths, since it's the Connector Host Service that performs the file retrieval, so the network drive needs to be accessible to the user that the Connector Host Service runs as (which by default is the System Windows user).

keyImageID (Mandatory)

Unique identifier of the product image. This identifier allows the Ecommerce systems receiving the product images to identify when the same product image is being uploaded again, and they may overwrite the previously uploaded image for the product instead of appending a new image.

In database tables the keyImageID would typically be set to a numeric primary key of the table. If no unique identifier can be easily found then it's recommended to use the file name of the image file.

keyProductID (Mandatory)

Unique identifier of the product associated to the product image. This allows Ecommerce systems to determine which product the image file is being uploaded against. It's recommended to set this to the same value set in the keyProductID field of the "Products" data export for each product. Note that the keyProductID may not be the same as the productCode, if the products are stored in a data source that assigns a different unique identifier to each product.

imageDescription (Optional)

Text that describes aspects of the product image or provides additional information about the image.

imageTitle (Optional)

Title of the image that provides a label for it.

Below is an example of CSV spreadsheet file data that the Product Images data export could be configured to read from.

keyImageID,keyProductID,filePath,imageDescription,imageTitle
1,PROD123,C:\data\products\images\img-123.jpg,Diagram1 of Product 123,Product 123 Diagram1
2,PROD123,C:\data\products\images\img-123.jpg,,Product 123 Diagram2
456[1],PROD456,images\456[1].jpg,,

Once the Product Images data export has been configured in the Generic adaptor then it is recommended to restart the Connector Host Service to ensure all new data source configurations have been loaded into the host service.

To start uploading product images follow these steps:

  1. Open the Connector application
  2. In the Adaptors and Messages tab click on the Exports/Routines button for the configured Generic adaptor
  3. In the Adaptor Export/Routine Schedule dialog, in the Data Exports tab select the Product Images export in the list.
  4. Select the Ecommerce platform to send the product images to.
  5. In the Data Transfer Mode drop down leave it set to Data Transfer Mode - Incremental.
  6. Click the Run Once Off button.

This will cause the Connector Host Service to do the following:

  1. Call the Generic adaptor to run the Product Images data export
  2. Retrieve the list of product image records from the configured data source assigned to the data export.
  3. Iterate through each product image record, and follow these steps for each image:
    1. Locate the image file based on the path set in the product image record
    2. Check that the image is under 2000 pixels in width and height; if not then it will resize the image to fit within the maximum pixel dimensions.
    3. Upload the image file to the selected Ecommerce platform.
  4. Continue to iterate through all product image records until all have been processed.
  5. Update the Data Export History logs to report the number of product images that have been successfully uploaded.

If a product image file could not be found, resized, or uploaded then an error log will be raised within the Connector messages, which can be found in the Connector application under the Adaptors and Messages tab. If an error occurred with one image file the export will still continue to process the remainder of the images in the record list (unless a critical error occurred).

Tips and Advice

  • When initially running the Product Images data export it is recommended to upload only a handful of images first, to check that the Product Images data export and its assigned data source have been configured correctly.
  • The Connector only supports uploading JPEG, GIF, or PNG product image files.
  • If uploading JPEG images ensure that the colour profile saved for the images is in the RGB colour format. This ensures that the images will display correctly on websites. Use Google to find tools that can automate changing the colour profiles of JPEG images in bulk if your JPEG image files are saved in a different colour profile.
  • Before running a Product Images data export, in the Adaptors and Messages tab change the Show Logs Types setting to ALL to be able to see extra information about what the Connector Host is doing when a Product Images data export is being run, such as which product image file is being uploaded.
  • If uploading a large number of product images on a frequent basis (such as 100 images or more) it is recommended to only run the data export every 1 to 4 weeks.
  • It may be possible to use an accounting or ERP system's database to generate the list of product image records. This could be done by configuring the Generic adaptor's Product Images data export to get records from the Products database table, and then building the filePath of the product images based on a product's code or other stored information. By setting the image's filePath based on a product code, the image filenames could be set to match the product code, allowing the images to be found and uploaded.
  • Note that the TOTECS Ecommerce platform will only either append a new product image to a product, or replace an existing image. Administrators will manually need to delete images from products within the platform's Administration Centre, within the Product Editor.

Exporting/Uploading Product Attachment Files

To allow the Generic adaptor to upload product attachment files a data source has to be first set up that will be used to obtain a list of attachment files to upload. The list needs to be stored in one of the Generic adaptor's supported data source formats, such as a CSV spreadsheet file, an ODBC/MS SQL compatible database table, an XML file, a JSON file, or being retrieved from another webservice.

Once a list of product attachment data has been set up then within the Generic adaptor the "Product Attachments" data export needs to be assigned to the data source and the export configured. The Product Attachments export requires that the following field data exists for each attachment record set up in the record list:

filePath (Mandatory)

File path to locate the product attachment on the computer that the Connector is running on. The file path can be absolute (such as C:\data\products\attachments\example.pdf) or relative (such as attachments\example.pdf).

If the file path is relative then the adaptor will prefix the value of the Product Attachment Directory setting in front of the filePath field. For example if the Product Attachment Directory setting was set to "C:\data\products", the adaptor would place that in front of the relative filePath attachments\example.pdf, making the absolute file path C:\data\products\attachments\example.pdf.

Be careful if setting network drives in file paths, since it's the Connector Host Service that performs the file retrieval, so the network drive needs to be accessible to the user that the Connector Host Service runs as (which by default is the System Windows user).

keyAttachmentID (Mandatory)

Unique identifier of the product attachment. This identifier allows the Ecommerce systems receiving the product attachments to identify when the same product attachment is being uploaded again, and they may overwrite the previously uploaded attachment for the product instead of appending a new attachment.

In database tables the keyAttachmentID would typically be set to a numeric primary key of the table. If no unique identifier can be easily found then it's recommended to use the file name of the attachment file.

keyProductID (Mandatory)

Unique identifier of the product associated to the product attachment. This allows Ecommerce systems to determine which product the attachment file is being uploaded against. It's recommended to set this to the same value set in the keyProductID field of the "Products" data export for each product. Note that the keyProductID may not be the same as the productCode, if the products are stored in a data source that assigns a different unique identifier to each product.

attachmentTitle (Optional)

Title of the attachment file that provides a label that users can recognise it by.

Below is an example of CSV spreadsheet file data that the Product Attachments data export could be configured to read from.

keyAttachmentID,keyProductID,filePath,attachmentTitle
1,PROD123,C:\data\products\attachments\att-123.pdf,Product 123 Diagram1
2,PROD123,C:\data\products\attachments\att-123.docx,Product 123 Diagram2
456[1],PROD456,attachments\456[1].xls,

Once the Product Attachments data export has been configured in the Generic adaptor then it is recommended to restart the Connector Host Service to ensure all new data source configurations have been loaded into the host service.

To start uploading product attachments follow these steps:

  1. Open the Connector application
  2. In the Adaptors and Messages tab click on the Exports/Routines button for the configured Generic adaptor
  3. In the Adaptor Export/Routine Schedule dialog, in the Data Exports tab select the Product Attachments export in the list.
  4. Select the Ecommerce platform to send the product attachments to.
  5. In the Data Transfer Mode drop down leave it set to Data Transfer Mode - Incremental.
  6. Click the Run Once Off button.

This will cause the Connector Host Service to do the following:

  1. Call the Generic adaptor to run the Product Attachments data export
  2. Retrieve the list of product attachment records from the configured data source assigned to the data export.
  3. Iterate through each product attachment record, and follow these steps for each attachment:
    1. Locate the attachment file based on the path set in the product attachment record
    2. Upload the attachment file to the selected Ecommerce platform.
  4. Continue to iterate through all product attachment records until all have been processed.
  5. Update the Data Export History logs to report the number of product attachments that have been successfully uploaded.

If a product attachment file could not be found or uploaded then an error log will be raised within the Connector messages, which can be found in the Connector application under the Adaptors and Messages tab. If an error occurred with one attachment file the export will still continue to process the remainder of the attachments in the record list (unless a critical error occurred).

Tips and Advice

  • When initially running the Product Attachments data export it is recommended to upload only a handful of attachments first, to check that the Product Attachments data export and its assigned data source have been configured correctly.
  • Before running a Product Attachments data export, in the Adaptors and Messages tab change the Show Logs Types setting to ALL to be able to see extra information about what the Connector Host is doing when a Product Attachments data export is being run, such as which product attachment file is being uploaded.
  • If uploading a large number of product attachments on a frequent basis (such as 100 attachments or more) it is recommended to only run the data export every 1 to 4 weeks.
  • It may be possible to use an accounting or ERP system's database to generate the list of product attachment records. This could be done by configuring the Generic adaptor's Product Attachments data export to get records from the Products database table, and then building the filePath of the product attachments based on a product's code or other stored information. By setting the attachment's filePath based on a product code, the attachment filenames could be set to match the product code, allowing the attachments to be found and uploaded. Each time a new attachment file is to be added on the filesystem, an administrator just needs to set the file name to match the product code that it's associated to.
  • Note that the TOTECS Ecommerce platform will only either append a new product attachment to a product, or replace an existing attachment. Administrators will manually need to delete attachments from products within the platform's Administration Centre, within the Product Editor.

Adaptor Presets

Within the Generic adaptor there is the ability to save all of its configurations (properties, data sources, data exports and data imports) to an adaptor preset file. This adaptor preset file can then be copied across and added to the preset list for any Connector installed on the same or different computers. The adaptor preset file can then be loaded into a Generic adaptor, adding its configured data sources to the adaptor, as well as updating the properties, data exports and data imports in the adaptor based on how the preset file was originally configured.

Adaptor Preset Benefits

Using adaptor presets provides the following benefits:

  • Allows configurations of a Generic adaptor to be saved to a file and exported or backed up to other computers.
  • Allows Generic adaptors to be quickly set up based on previous configurations.
  • Allows Generic adaptors to be quickly set up based on well known configurations for specific systems and data sources that the Generic adaptor can handle.
  • Reduces errors and time in setting up Generic adaptors
  • Allows knowledge to be shared on how the Generic adaptor can be configured in known ways.
  • Allows 3rd parties to tailor adaptor configurations for their supported software and share these configurations with their clientele.
  • Allows settings to be tailored to how the Generic adaptor is set up for specific software and data sources.

Adaptor Preset File Format

Adaptor Preset files are saved in the JSON (JavaScript Object Notation) file format with a .json file extension. The JSON file format is a human readable text file format that can be viewed with any text editor application. The file can easily be passed between computers and will typically be 0.5-2MB in size. It is highly recommended to avoid directly modifying the contents of this file, and to instead use the Connector application to create, save and load the contents of this file. If it is incorrectly modified the Connector application may not be able to read the file, or if it loads the file it may not be able to correctly configure the Generic adaptor.

Adaptor Preset Data Hooks

When a Generic adaptor is to be saved as an adaptor preset, in each of the adaptor's properties, data source properties, data export and data import Custom Data Field Definitions there is the ability to set Adaptor Preset Data Hooks. These data hooks can be set in the form:

<<preset_hook_name>>

The double opening and closing angled brackets contain the name of the preset data hook. The preset hook name can be set to any text you wish to name the hook.

When the adaptor is saved as a preset file, the Connector application will go through all the adaptor's properties and fields and build up a list of the unique preset data hooks. You can then set a label, description and value data type for each preset data hook. After this is done, when the adaptor preset file is loaded back into a Generic adaptor, the user will be prompted to set a value for each of the preset data hooks you defined. Once they have set all the preset hook values, the preset will be loaded into the adaptor and all of the preset data hooks will be replaced with the values that the user has set.

Using Preset Data Hooks allows you to create an unlimited number of settings to help users configure the Generic adaptor based on the software or data sources your preset targets, using language that is familiar to that software or data source.
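
For example (all hook names below are hypothetical), a data source's web service URL property could be saved in a preset as:

https://<<erp_server_host>>:<<erp_server_port>>/api/products

When the preset is later loaded into a Generic adaptor, the user would be prompted to enter values for the erp_server_host and erp_server_port hooks, and the entered values would replace the hooks in the URL.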

Bundled Adaptor Presets

When a Connector application is installed it will already come bundled with a number of Generic Adaptor Presets based on well known configurations. These bundled adaptor presets can be used without having to add in any preset file, and are guaranteed to exist in all Connector application installations. The bundled adaptor presets cannot be removed from the Connector application, and over time may change or be added to as newer versions of the Connector application are released.

The following bundled presets are included within the Generic adaptor:

  • SQUIZZ.com Connector Data Set
    Configures the Generic adaptor with an SQLite data source type, and configures all data exports that enable exporting data out of a Connector's Data Set database file, at the specified location.
     
  • SQUIZZ.com API (Supplier Product Data Retrieval)
    Configures the Generic adaptor with a JSON/XML/CSV Web Service/File data source type that connects to the SQUIZZ.com API, then configures the Products, Product Stock Quantities, and Product Account Pricing data exports to obtain data from a supplier organisation who has hosted their data on the SQUIZZ.com platform.
     
  • Ecommerce Standards Documents - Data Imports To JSON Files
    Configures the Generic adaptor with a JSON/XML/CSV Web Service/File data source type that is used with all of the adaptor's data imports, to allow sales orders, purchase orders, customer account payments and supplier invoices to be saved to files, at a designated filesystem location, in the JSON data format conforming to the Ecommerce Standards Documents data structure.
     
  • Ecommerce Standards Documents - Data Imports To XML Files
    Configures the Generic adaptor with a JSON/XML/CSV Web Service/File data source type that is used with all of the adaptor's data imports, to allow sales orders, purchase orders, customer account payments and supplier invoices to be saved to files, at a designated filesystem location, in the XML data format conforming to the Ecommerce Standards Documents data structure.
     
  • Micronet Harmoniq
    Configures the Generic adaptor with JSON/XML/CSV Web Service/File data source types to connect to the Micronet Harmoniq business system's SOAP based web service. Many data exports are set up to export data from the web service, as well as allow sales orders to be imported into Harmoniq.
     
  • Currency Exchange RatesAPI.io
    Configures the Generic adaptor with JSON/XML/CSV Web Service/File data source types to connect to the Currency Exchange RatesAPI.io web service to obtain exchange rates across many popular currencies that the European Central Bank provides data for. Use this preset to continually update indicative exchange rates in associated Ecommerce platforms.
     
  • Readysell
    Configures the Generic adaptor with both the Ecommerce Standards Documents and the JSON/XML/CSV Web Service/File data source types to connect to the Readysell Enterprise Resource Planning (ERP) system's web service. The preset configures all data exports required to allow an organisation to sell within connected Ecommerce platforms. The preset also allows sales orders to be imported back into Readysell.
     
  • SQUIZZ.com PARtsDB Addon Connector
    Configures the Generic adaptor with an SQLite data source type to obtain make/model, category, and product data that was exported from the PARtsDB web platform, and saved by the SQUIZZ.com PARtsDB Addon Connector to an SQLite database. This make/model data may be used to allow organisations to offer make/model catalogue searching capabilities within connected Ecommerce platforms.
     
  • Micronet Distribution Sales Order CSV file data import
    Configures the Generic adaptor with a JSON/XML/CSV Webservice/File data source type to generate each sales order as a CSV spreadsheet file. The sales order CSV file can be saved on the file system and then imported into the Micronet Distribution system through its DataConnect software, with all the available sales order data able to be customised and embedded within the CSV file. The preset generates CSV file data in the same overall structure as the Micronet Distribution system's adaptor, while also allowing further customised business logic to be set in the sales order CSV file using the JSON/XML/CSV field functions.
     
  • Channel Advisor - Sales Orders Data Exports
    Configures the Generic adaptor with a JSON/XML/CSV Webservice/File data source type to retrieve a list of sales orders raised in the Channel Advisor platform within a specified number of past days, then export the sales orders to allow them to be imported into a selected Ecommerce platform, such as SQUIZZ.com. This preset allows organisations selling on consumer-based marketplaces that Channel Advisor supports (eBay, Amazon, Kogan, Catch, etc.) to have the sales orders raised in those marketplaces retrieved and imported into the organisation's business systems using both the Generic adaptor and the Ecommerce platform.
    In order for the preset to work, an organisation must have previously set up an account on the Channel Advisor platform, and a new or existing Channel Advisor developer account must have an "app" registered against it that is linked to the Channel Advisor organisation's account. Additionally the organisation must have been previously registered on the Ecommerce platform the sales orders are being imported into, and have imported their sell units, taxcodes, surcharges, payment types, products and customer accounts into that platform.

Create An Adaptor Preset File

To create an adaptor preset file follow these steps:

  1. Open the Connector application
  2. In the Adaptors and Messages tab click on the Settings button for a Generic adaptor you wish to create a preset for.
  3. In the Generic Adaptor Settings window configure the adaptor to your needs.
  4. Click on the Create button at the top of the window to create an adaptor preset.
  5. In the Adaptor Preset window, in the Preset Name text box, type a name for the preset. Set a name that allows the preset to be easily understood, such as the name of the software or data sources that it is configured to read/write data from.
  6. In the Preset Description text area type in a more detailed description of what the preset is, how it configures the adaptor, or any important information that needs to be known about it.
  7. In the Preset File Path text box browse and set the full file path where the preset file will be saved. Ensure that the file name has the .json extension.
  8. In the Preset Data Hooks table, for any preset data hooks that you have defined against the adaptor's settings, data sources, data imports and data exports, do the following for each data hook row:
    1. In the Label column set the label that users will see when they need to set a value for the data hook. Choose a label that users will be able to recognise.
    2. In the Default Value column type in the text that will appear by default in the data hook field when users load the preset.
    3. In the Data Type column choose the data type that the value the user sets in the data hook will be converted into when saved.
    4. In the Description column set a longer description that helps the user to understand what value should be set in the preset data hook when loading the adaptor preset.
  9. Click on the Save button.

The Connector application will save an adaptor preset file to the specified location on the file system, containing all of the adaptor's settings, data sources, data imports and data export configurations. If the file saves successfully then the new adaptor preset will also be added to the list of presets available to Generic adaptors.
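
Because the preset file is saved with a .json extension, one quick way to sanity-check an exported preset before sharing it is to confirm that it parses as valid JSON. The sketch below is a hypothetical helper, not part of the Connector, and my_adaptor_preset.json is a placeholder file name.

  import json
  import sys

  # Hypothetical helper (not part of the Connector): confirm an exported
  # preset file parses as valid JSON before distributing it.
  preset_path = sys.argv[1] if len(sys.argv) > 1 else "my_adaptor_preset.json"
  try:
      with open(preset_path, encoding="utf-8") as preset_file:
          json.load(preset_file)
      print(preset_path, "parsed as valid JSON")
  except (OSError, json.JSONDecodeError) as err:
      print(preset_path, "failed the check:", err)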

Add An Existing Adaptor Preset File To A Connector Application

To add an existing adaptor preset file into a Connector Application installed on a computer follow these steps:

  1. Open the Connector application
  2. In the Adaptors and Messages tab click on the Settings button for any Generic adaptor.
  3. Click on the Add button.
  4. In the Open dialog find and select the adaptor preset file.

The Connector application will try to open the preset file and check that it is a valid preset file. If this check succeeds then the adaptor preset will be added to the list of presets in the Generic Adaptor Settings window, and be available to load for any Generic adaptors that exist in the Connector.

Load An Adaptor Preset Into A Generic Adaptor

To load an adaptor preset into a Generic adaptor, and have its data sources, settings, data import and data export configurations changed by the preset, follow these steps:

  1. Open the Connector application
  2. In the Adaptors and Messages tab click on the Settings button for the Generic adaptor you wish to load the adaptor preset into.
  3. In the Load Adaptor Preset drop down choose the adaptor preset you wish to load.
  4. Click on the Load button.
  5. In the Load Adaptor Preset confirmation dialog choose Yes if you wish to load the adaptor preset. By clicking Yes you allow the adaptor preset to add additional data sources to the adaptor, as well as allow all existing properties to be modified, and any data import and data export configurations to be overwritten. It's highly recommended to create an adaptor preset first if you are unsure whether you wish to allow these changes, since this will allow you to get the adaptor back to its original state if required.
  6. If the Adaptor Preset contains data hooks then the Adaptor Preset Data Hook Settings window will appear. For each row in the Value column you need to set a value for the data hook. Read the label of the Data Hook Setting as well as the Description to understand the kind of value that you should set.
    1. Click on the Save Settings button.

The Connector application will then try to add any data sources to the Generic adaptor that exist within the adaptor preset. It will overwrite any settings in the adaptor's General tab. It will change any data imports and data exports that were configured when the adaptor preset was created. If all is successful then a message will display advising that the preset was successfully loaded, otherwise an error message will display advising why it could not be loaded.

Remove A Preset File From A Connector Application

To remove an existing adaptor preset file from a Connector Application installed on a computer follow the steps below. Note that removing an adaptor preset will not delete its preset file from the file system; it will only remove it from the list of adaptor presets within the Connector application.

  1. Open the Connector application
  2. In the Adaptors and Messages tab click on the Settings button for any Generic adaptor.
  3. In the Load Adaptor Preset drop down choose the adaptor preset you wish to remove.
  4. Click on the Remove button.

The Connector application will try to remove the adaptor preset from the list. If successful the preset will no longer appear in the list. If you accidentally removed the preset you can add it back to the list by following the steps in the Add An Existing Adaptor Preset File To A Connector Application section above. Note that it is not possible to remove any of the adaptor presets that come bundled with the Connector when it is installed.

Adaptor Routines

The Generic adaptor has a number of routines, listed below, that can perform different tasks with each adaptor instance. Routines can be scheduled to run at regular intervals, allowing repeated tasks in web services to be called at specific times and dates. The Generic adaptor contains the following routines:

  • Generic HTTP Get Request
    The routine makes an HTTP GET request to call a specified URL on any web service. This can allow the routine to trigger events and actions on remote web servers. The routine may be used for polling, to check that a remote web service is running, or to start a process or begin processing data within the called web service (a minimal sketch of such a request appears after this list).
     
  • SQUIZZ.com Reconcile And Send Customer Invoices
    This routine looks at all sales orders assigned to the adaptor and matches them up to invoices using the Customer Account Enquiry Invoice Record search, then sends the invoices found for matching orders across to a connected customer organisation on SQUIZZ.com. This allows invoices to be sent automatically to customer organisations who are registered on SQUIZZ.com and have connected their business system.
     
  • Run Batch Script
    The routine runs a batch script file at a specified location on the file system, and executes each of the operating system commands that exist within the batch script file. This effectively allows the Connector to modify different aspects of the operating system, or trigger other software to perform actions within Microsoft Windows (a rough analogue of invoking such a script appears after this list).
     
  • Missing Dataset Product Field Data Report
    The routine generates a spreadsheet CSV report file that outlines the products within a Data Set that are missing data. Optionally the routine can send out an email containing the report's spreadsheet file, as well as a summary of the product data field counts that contain missing data. The report can make it easier for data maintainers to determine what product data they need to add, to ensure products can be seen and sold across connected Ecommerce platforms.
     
  • SQUIZZ.com - Smartfreight Customer Freight Delivery Notice Export
    The routine looks at sales orders imported into the Connector against its adaptor, and tries to find freight delivery/tracking data associated to the order within SmartFreight's system against a linked consignment note. If found then a delivery notice is created and sent back to the customer via SQUIZZ.com. This allows the customer (or associated systems) to be aware of the progress of the ordered goods being delivered, as well as automating freight data being retrieved and sent out from SmartFreight's system, possibly saving many hours of manual key entry and associated costs for both suppliers and customers. This routine may be used to send Advanced Shipping Notices (ASN).
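
As a rough illustration of what the Generic HTTP Get Request routine does, the sketch below performs a single HTTP GET against a placeholder URL. This is a minimal Python sketch of the assumed behaviour, not the routine's actual implementation, and https://example.com/api/start-processing is not a real endpoint.

  import urllib.request

  # Minimal sketch of an HTTP GET trigger (illustrative only; the URL below
  # is a placeholder, not a real endpoint).
  url = "https://example.com/api/start-processing"
  try:
      with urllib.request.urlopen(url, timeout=30) as response:
          # A 2xx status suggests the remote web service received the call.
          print("GET", url, "returned HTTP", response.status)
  except OSError as err:
      print("GET", url, "failed:", err)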
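
Similarly, a rough analogue of what the Run Batch Script routine does is shown below: invoking a batch file through cmd.exe and waiting for all of its commands to finish. The script path is a placeholder, and this is an illustration of the concept rather than the Connector's own code.

  import subprocess

  # Rough analogue of running a batch script on Windows (illustrative only;
  # the script path below is a placeholder).
  script_path = r"C:\scripts\nightly_tasks.bat"
  # cmd.exe /c runs the batch file and returns once its commands complete.
  result = subprocess.run(["cmd.exe", "/c", script_path],
                          capture_output=True, text=True)
  print("exit code:", result.returncode)
  print(result.stdout)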

How To Set Up a New Generic Adaptor Routine

To set up a new Generic adaptor routine follow these steps:

  1. Open the Connector application
  2. Within the Adaptors and Messages tab, in the adaptors list find a Generic adaptor you wish to create a routine for, then click on the Settings button.
  3. Within the Generic adaptor's Settings window click on the tab labelled "Data Routines".
  4. In the Routine Types drop down select the type of routine you wish to create.
  5. Click on the Create Routine button to create the routine.
  6. In the Routine Code textbox type the name of the routine. Give it a name that is unique to the adaptor.
  7. Click the Save button to update the routine's code.
  8. In the Settings tab, if the routine contains any settings, set their values in the Setting Value column.
  9. Click on the Data Fields tab to view and set any data fields required for the routine.

The routine will now be created. To learn more about what each routine does and how it needs to be configured, click on the routine links above.

JSON/XML/CSV Data Field Functions

If the Generic adaptor is set up to use a "JSON/XML/CSV - Webservice/File" data source type to query data in data exports and data imports, then the JSON/XML/CSV field functions can be used within the "Custom Data Field Definition" of a data export or data import, to query and manipulate the JSON data mapped to each field before it is sent out of the Connector. Functions can be nested inside other functions' arguments; however, avoid excessive nesting, since each function incurs a processing cost that can add up to longer waiting times for a data export to complete.
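
As a rough illustration of why deep nesting adds cost, the plain Python sketch below (not the Connector's field function syntax) evaluates nested functions innermost-first, so each extra level adds another pass over the value for every exported row. The names trim and upper, standing in for hypothetical field functions such as TRIM and UPPER, are examples only.

  # Plain Python illustration (not the Connector's field function syntax):
  # nested functions evaluate innermost-first, so every extra level of
  # nesting adds another pass over the value for each exported row.
  def trim(value: str) -> str:
      return value.strip()

  def upper(value: str) -> str:
      return value.upper()

  rows = ["  blue widget  ", "  red widget  "]

  # Equivalent of a hypothetical nested call such as UPPER(TRIM(field)),
  # applied once per exported row; more wrappers multiply the work done.
  exported = [upper(trim(row)) for row in rows]
  print(exported)  # ['BLUE WIDGET', 'RED WIDGET']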

A complete list of the field functions can be found at JSON/XML/CSV Data Field Functions.