Push API
Example Integrations
Below is a typical example of how you could consume realtime event data from Jomablue into your existing business systems. In this example we use a generic CRM for illustration purposes, but this can work with any CRM on the market that has a data-importing API. We are also going to utilise AWS SQS, which gives us a bit more control over the "flow rate" of data coming from Jomablue: it allows us to slow the ingestion of data down, batch it, and adhere to our CRM's rate limits.

We recommend that any data consumed from Jomablue should go through any existing 'lead ingest' workflows you have in your CRM. That way your business logic, deduplication or routing rules are still applied to Jomablue data (keep in mind Jomablue can create new person records as well as import your existing ones). Jomablue can maintain CRM identifiers throughout to help with matching data from Jomablue against existing CRM data.

The above is very top level. We actually recommend following an existing workflow in your CRM, the same workflow that leads from your website would follow to be ingested into your CRM.

Depending on your existing infrastructure, you might consider a setup like the following, which uses polling instead. In this example, the data from Jomablue is 'queued' in SQS, and you use a third-party tool or some custom scripts within your CRM to poll the SQS queue every 5 minutes to fetch recent updates from the event. SQS is advantageous in this situation as it can continue to hold data until you retrieve it on the next poll (which could be seconds, minutes or hours). A sketch of such a polling script appears at the end of this section.

Simplest approach: SNS to HTTP endpoint
The simplest approach to integration is to use SNS to pass the data on to an HTTP endpoint as a POST. So if the CRM has an HTTP API to create new leads, SNS can be configured to translate the data coming from Jomablue into an HTTP request and POST it. The biggest consideration here is whether the HTTP endpoint can accept the volume of requests and won't be rate limited. SNS does support some rate limiting to HTTP endpoints, but you don't have full control. A sketch of configuring this kind of subscription also appears at the end of this section.

Transferring data to another cloud provider/platform for processing
In the following example, a simple Lambda function is used to pass the payloads received from Jomablue into another cloud provider which is within the current ecosystem of this customer (i.e. as opposed to using AWS). The Lambda could be connected to SNS or SQS; our recommendation would be SQS, to give you the option to queue data and process it on your schedule. This setup allows a small Lambda function to live within an AWS account, process the Jomablue data, and send it wherever is most suitable for your organisation to consume. That could be processing it into another POST API, storing it in CSV files, or storing it in a database that exists somewhere else. A sketch of such a function follows below.
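The sketch below shows one way such a Lambda function might look, assuming an SQS trigger and a hypothetical destination URL supplied via the DESTINATION_URL environment variable. It simply forwards each payload as a JSON POST; swap that step for a database insert, CSV write or SDK call into your own platform as needed. Note that if the queue is subscribed to SNS without raw message delivery, each body will be an SNS envelope with the Jomablue payload under "Message".

```python
# Minimal sketch of a Lambda handler triggered by SQS, forwarding each
# Jomablue payload to another platform. DESTINATION_URL is a placeholder
# for your own ingestion endpoint.
import json
import os
import urllib.request

DESTINATION_URL = os.environ["DESTINATION_URL"]  # e.g. an API in your own cloud


def lambda_handler(event, context):
    # An SQS-triggered Lambda receives a batch of messages under "Records"
    for record in event["Records"]:
        payload = json.loads(record["body"])

        # Forward the payload as a JSON POST; replace with whatever your
        # own platform expects (another API, CSV storage, a database, etc.)
        request = urllib.request.Request(
            DESTINATION_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            response.read()

    # Raising an exception instead would return the batch to the queue for
    # retry, which is usually what you want for transient downstream errors
    return {"forwarded": len(event["Records"])}
```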
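If you take the SNS-to-HTTP route, the subscription itself can be created with a few lines of boto3. The topic ARN, region and endpoint URL below are placeholders, and the delivery policy shown is only an assumption about how you might cap the request rate to the CRM; check SNS's delivery policy options against your CRM's actual limits.

```python
# Minimal sketch of subscribing a CRM's HTTP endpoint to the SNS topic that
# receives Jomablue push data. All ARNs, URLs and limits are placeholders.
import json
import boto3

sns = boto3.client("sns", region_name="ap-southeast-2")

# SNS offers some throttling and retry control via the subscription's
# delivery policy, but not fine-grained flow control
delivery_policy = {
    "healthyRetryPolicy": {"numRetries": 5, "minDelayTarget": 20, "maxDelayTarget": 300},
    "throttlePolicy": {"maxReceivesPerSecond": 5},
}

response = sns.subscribe(
    TopicArn="arn:aws:sns:ap-southeast-2:123456789012:jomablue-events",  # placeholder
    Protocol="https",
    Endpoint="https://crm.example.com/api/leads",  # placeholder CRM lead-ingest URL
    Attributes={
        "DeliveryPolicy": json.dumps(delivery_policy),
        # Deliver the payload as-is rather than wrapped in the SNS envelope
        "RawMessageDelivery": "true",
    },
    ReturnSubscriptionArn=True,
)

# The endpoint must confirm the subscription (SNS first POSTs a
# SubscriptionConfirmation message) before notifications are delivered
print(response["SubscriptionArn"])
```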
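For the polling setup, a short script like the following, run on a schedule (for example every 5 minutes via cron or a scheduled task), could drain whatever has accumulated on the queue and hand each payload to your CRM's lead-ingest workflow. The queue URL, region and send_to_crm() function are placeholders for your own environment, and as above the body may be an SNS envelope if raw message delivery is not enabled on the queue subscription.

```python
# Minimal sketch of polling the SQS queue for Jomablue event data on a
# schedule. QUEUE_URL and send_to_crm() are placeholders for your setup.
import json
import boto3

QUEUE_URL = "https://sqs.ap-southeast-2.amazonaws.com/123456789012/jomablue-events"
sqs = boto3.client("sqs", region_name="ap-southeast-2")


def send_to_crm(payload: dict) -> None:
    # Placeholder: call your CRM's lead-ingest API here so your existing
    # deduplication and routing rules still apply to Jomablue data
    print("would send to CRM:", payload)


def poll_once() -> None:
    # Keep receiving until the queue is drained for this poll
    while True:
        result = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,  # SQS returns at most 10 messages per call
            WaitTimeSeconds=10,      # long polling to reduce empty responses
        )
        messages = result.get("Messages", [])
        if not messages:
            break
        for message in messages:
            payload = json.loads(message["Body"])
            send_to_crm(payload)
            # Only delete once the CRM has accepted the record, so failures
            # leave the message on the queue for the next poll
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])


if __name__ == "__main__":
    poll_once()
```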