Understanding the API
The MYPOS Connect API is a CRUD (Create, Read, Update, Delete) Web API that uses JSON and operates over HTTPS. Once you have been given an account to access the API, you can use the login method below to obtain a Bearer Token for use with subsequent API methods.
Your account will be assigned appropriate read/write/delete permissions for different entities within the MYPOS Connect system. For example, depending on the project you are developing, your login might be able to access products but not customers. Similarly, you might be able to read products but not update them. The API authorisation allows for different modules per user and different access rights for each module. Each entity has a corresponding document detailing its API methods.
In general, the methods in the API return HTTP status codes of 200 (OK) or 202 (Accepted) for success, and 400 (Bad Request) for failure (with information in the body content).
Other possible return codes are 403 (Forbidden), which means you have not been given access to that method; 401 (Unauthorised), which most likely means your bearer token has expired; and 500 (Internal Server Error), which can be reported to our helpdesk.
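As a rough illustration, these return codes might be handled like this in Python (a sketch only; the requests library and the exceptions raised are our own choices, not part of the API):

import requests

def handle_response(response: requests.Response):
    # 200/202: success, data is in the JSON body
    if response.status_code in (200, 202):
        return response.json()
    # 400: failure, with information in the body content
    if response.status_code == 400:
        raise ValueError("Bad request: " + response.text)
    # 401: the bearer token has most likely expired
    if response.status_code == 401:
        raise PermissionError("Unauthorised: log in again for a new token")
    # 403: this account has not been given access to the method
    if response.status_code == 403:
        raise PermissionError("Forbidden: no access to this method")
    # 500 (or anything else): report to the helpdesk
    raise RuntimeError("Server error: HTTP " + str(response.status_code))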
The Bearer Token
Login Method to Obtain Bearer Token
End Point: https://api.myposconnect.com/api/v2/auth/token
Verb: POST
Return Codes:
200 for success. The response body contains JSON with the bearer token, which you need to save for subsequent API calls.
400 (Bad Request) is returned for login failure; see the “failureDescription” node in the JSON body.
Notes: Use HTTPS Basic Authentication.
IMPORTANT FOR YOUR DESIGN! The bearer token will expire after two hours.
IMPORTANT FOR YOUR DESIGN! You can only request 250 Bearer tokens in one twenty-four hour period.
The following are examples of what is returned in the response body for successful and failed login attempts respectively. Please note that this JSON may contain other data nodes.
success
{
"failureDescription": null,
"bearerToken": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
}
failure
{
"failureDescription": "Invalid username or password",
"bearerToken": null
}
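For example, the login call might look like the following in Python, using the requests library (a minimal sketch; the credential values are placeholders, while the endpoint and JSON field names are those shown above):

import requests

TOKEN_URL = "https://api.myposconnect.com/api/v2/auth/token"

def login(username: str, password: str) -> str:
    # HTTPS Basic Authentication, as noted above
    response = requests.post(TOKEN_URL, auth=(username, password))
    body = response.json()
    if response.status_code == 200:
        # Cache this token: it is valid for two hours, and only 250
        # tokens may be requested per twenty-four hour period.
        return body["bearerToken"]
    # 400 (Bad Request): the reason is in failureDescription
    raise RuntimeError(body["failureDescription"])

Given the two-hour expiry and the 250-token daily limit, it is worth caching the token and only logging in again when a call returns 401, rather than requesting a fresh token for every call. Subsequent calls would then typically send the token in a standard Authorization: Bearer header.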
lipage and lipagesize
Avoiding timeouts on large tables
Whereas some clients have small data tables, for example fewer than a thousand products or a thousand customers, other clients have very large tables with hundreds of thousands or millions of rows. A call to get all customers from a table with a row count of, say, 3 million will simply result in your program timing out and being blocked until the new twenty-four hour period.
To allow for correct access to large datasets we have added paging, so that the data can be retrieved in smaller batches. This might be used, for example, in conjunction with the lasteditdateUTC column to get customers modified in the last hour.
Examples
If the products table has a low record count, I would tend to request all records at once, like this:
https://api.myposconnect.com/api/v2/products?lipage=1&lipagesize=999999
So the lipagesize is set high and I just ask for page 1.
If the client's database had 100,000 products, however, I might choose to get them in batches of 2,000, as below.
Note: the "liTotalCount" field is always returned, so you can calculate how many pages there are for the page size you requested. For example, if the total count was 84,590 and you requested a page size of 2,000, there would logically be 43 pages (42 x 2,000 = 84,000, and the 43rd page has the remaining 590 items).
https://api.myposconnect.com/api/v2/products?lipage=1&lipagesize=2000
https://api.myposconnect.com/api/v2/products?lipage=2&lipagesize=2000
https://api.myposconnect.com/api/v2/products?lipage=3&lipagesize=2000
and so forth.
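Putting this together, a paging loop might look like the following in Python (a sketch only: liTotalCount is documented above, but the "products" node name and the Authorization: Bearer header format are assumptions):

import math
import requests

PRODUCTS_URL = "https://api.myposconnect.com/api/v2/products"
PAGE_SIZE = 2000

def fetch_all_products(bearer_token: str) -> list:
    headers = {"Authorization": "Bearer " + bearer_token}
    products = []
    page = 1
    while True:
        response = requests.get(
            PRODUCTS_URL,
            headers=headers,
            params={"lipage": page, "lipagesize": PAGE_SIZE},
        )
        response.raise_for_status()
        body = response.json()
        # "products" is an assumed node name for the page of results
        products.extend(body["products"])
        # liTotalCount is always returned, so the page count can be computed
        total_pages = math.ceil(body["liTotalCount"] / PAGE_SIZE)
        if page >= total_pages:
            return products
        page += 1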
sFieldList
Another field worth mentioning on API calls is "sFieldList". If you know you only want a few of the values for each product, you can use sFieldList to limit the amount of data that comes back, for instance:
https://api.myposconnect.com/api/v2/products?lipage=1&lipagesize=1&sfieldlist=productcode,productReportingClass,fullDescription
This limits the amount of data transferred and also gives a much faster response.
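The same call expressed with the requests library might look like this (bearer_token is a placeholder for a token obtained via the login method above):

import requests

bearer_token = "..."  # obtained via the login method above

response = requests.get(
    "https://api.myposconnect.com/api/v2/products",
    headers={"Authorization": "Bearer " + bearer_token},
    params={
        "lipage": 1,
        "lipagesize": 1,
        "sfieldlist": "productcode,productReportingClass,fullDescription",
    },
)
print(response.json())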