
Overview

Foreword

The first thing to do when you want to set up an automated data exchange between your system and Actito is to answer a few questions:

  • Is real-time synchronization necessary?
  • Is the synchronized data involved in real-time scenarios?
  • What volume are you handling?
  • What can your technical stack and staff handle?
  • Are the answers to those previous questions compatible with the limits established by the Actito Integration Framework?

Once you have answered them, you can dig into the relevant case below and learn how to work with it!

Solutions Overview

To cover a wide range of situations and fit most technical stacks, the Actito Integration Framework provides two types of entry points to set up data synchronization to your Actito tables: API calls and file transfer server (FTP) synchronizations.

The following table helps you find the most suitable solution.

| Frequency | API calls | File transfer server (FTP) synchronizations |
| --- | --- | --- |
| Real-time synchronization | 1 by 1 API | / |
| Multiple times a day (non-fixed frequency) | Automated ETLs triggered by API | Automated ETLs triggered by file synchronization |
| Once a day (fixed frequency) | Automated ETLs triggered by API | Scheduled automated ETLs |
| Occasionally with variable parameters | One shot ETLs | / |

Technical Resources

The first distinction, in the headers of the table, revolves around your technical resources and preferences.

  • If you do not have developers on staff or want to minimize development and code writing, ETL synchronizations through a file transfer server are your best solution. These flat file exchanges rely on a file transfer server (FTPS or SFTP) where your only responsibility is to provide the file. Once an ETL is set up, it automatically retrieves the file and integrates the data into your tables.

  • If you have more technical resources and want to build more complex integrations by programming them yourself, you can rely on the Actito public APIs to code your calls and integrate your data exchanges in your stack.
    This offers more flexibility and more capabilities.

Frequency

The rows of the table focus on the frequency of the data synchronization, and indirectly on its volume. These two parameters are usually linked: more frequent imports contain less data, and vice versa.

  • If you need to synchronize data as soon as a new record is created, that is, one record at a time in real time, the best solution is to use the 1 by 1 API. This kind of synchronization must be programmed by your developers, as real-time synchronization cannot be achieved through flat file transfers.

  • If you need to synchronize batches of data once or multiple times per day, as soon as the file is available or at a variable frequency, there are several solutions depending on the technical resources at your disposal and the recurrence of these imports:

    • If your developers are available to code API calls, they can program bulk imports:
      • with automated ETLs triggered by API if these are recurring imports with the same parameters.
      • with one-shot ETL executions if the parameters change with every import.
    • If you favor flat file exchanges on an FTP server, it is possible to trigger an import automatically whenever a file is dropped on the FTP server, thanks to automated ETLs triggered by file synchronization.
  • If you need to synchronize a large batch of data once a day, always at the same time, the best solution is to use a scheduled automated ETL through an FTP server.

Case by case Q&A

In the following section, we present practical cases and the recommended solution for each of them.

I need to import data in real time

Whenever a user registers on my website or makes an order, I want to push the data to Actito to trigger marketing actions immediately.

For real-time synchronization, your go-to option is the 1 by 1 API.

It implies creating or updating a single profile or record at a time and is therefore not appropriate for synchronizing large amounts of data at once.
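
As a purely illustrative sketch (in Python with the requests library), a 1 by 1 synchronization boils down to one authenticated REST call per record. The base URL, endpoint path, entity and table names below are placeholders, not actual Actito routes: check the Actito API reference for the exact contract of your licence.

```python
import requests

# Placeholder values: the real base URL, path, credentials and table names
# come from the Actito API documentation and your licence setup.
BASE_URL = "https://api.example-actito-host.com/v4"
API_KEY = "your-api-key"

def push_profile(profile: dict) -> None:
    """Create or update a single profile as soon as the event happens (hypothetical endpoint)."""
    response = requests.post(
        f"{BASE_URL}/entity/MY_ENTITY/table/Customers/profile",  # illustrative path only
        json=profile,
        headers={"Authorization": API_KEY, "Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()

# Called from your registration or order flow, one record at a time.
push_profile({"emailAddress": "jane.doe@example.com", "firstName": "Jane"})
```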

I need to import data batches several times a day

To ensure accurate targetings, I want to synchronize order data from my physical stores every hour during their opening hours.

When the data does not trigger immediate action but you still require a frequent, near real-time synchronization, Actito provides you with several options to push data in batches:

  • Automated ETLs triggered by API
  • Automated ETLs triggered by file synchronization
  • One shot ETL executions

Info: In all 3 cases allowing near real-time batch synchronization, the base limit is 12 bulk imports per table per day. If you require more frequent synchronization, please contact your account manager.

I need to import data in bulk by transferring it through an FTP server

I do not have developers on my team to write complex code, but my CRM or ERP can export the source data as flat files to an FTP server.

Automated ETLs are ideal if you need or want to rely on an FTP server (which can be your own server or one provided by Actito) for your data exchanges.

Depending on the required frequency, you have 2 options:

  • Automated ETLs triggered by file synchronization, which start as soon as a file is dropped on the server (variable frequency).
  • Scheduled automated ETLs, which run at a fixed time, typically once a day.
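
As a minimal sketch of the "file drop" side of such an exchange, assuming an SFTP server and using the paramiko library: the host, credentials, and target folder below are placeholder values provided when your FTPS/SFTP access is set up. Once the file lands in the watched folder, the automated ETL picks it up with no further code on your side.

```python
import paramiko

# Placeholder connection details: use the host, port and credentials provided for your licence.
HOST = "sftp.example.com"
USERNAME = "my-licence"
PASSWORD = "secret"

transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)

# Drop the export produced by your CRM/ERP in the folder watched by the ETL (illustrative path).
sftp.put("exports/orders_2024-05-01.csv", "/upload/orders_2024-05-01.csv")

sftp.close()
transport.close()
```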

I need to import data in bulk by pushing a file through an API call

To avoid intermediary steps, I do not want to use an FTP server. I have developers on my team who can code API calls directly integrated into my data exchanges.

There are 2 methods to push a data file directly in an API call: 'one shot' ETL executions and ETL executions 'triggered by API'.

When to use 'one shot' or 'triggered by API'?

In a one shot execution, everything is defined on the fly: you push both the data file and the ETL definition in your call. One shot executions are mostly recommended for non-recurring or exceptional data flows, or when the parameters of your data flows are highly variable (e.g. your data comes from multiple partners and the report recipient must differ depending on the source).

On the other hand, if you have recurrent data flows where the definition stays the same from one day to the next, you can create an ETL with a fixed definition and trigger it by API. In this case, your API call only contains the data file, as the import parameters are already known.
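
To make the difference concrete, here is an illustrative sketch (Python, requests library) of what the two calls could look like; the endpoint paths and payload fields are assumptions, not the actual Actito routes. The 'triggered by API' call only ships the data file, while the 'one shot' call ships the data file together with an ETL definition.

```python
import json
import requests

BASE_URL = "https://api.example-actito-host.com/v4"  # hypothetical base URL
HEADERS = {"Authorization": "your-api-key"}

# 'Triggered by API': the ETL definition already exists in Actito, so only the data file is sent.
with open("orders.csv", "rb") as data_file:
    requests.post(
        f"{BASE_URL}/etl/my-recurring-import/executions",  # illustrative path only
        headers=HEADERS,
        files={"file": data_file},
        timeout=60,
    ).raise_for_status()

# 'One shot': the call carries both the ETL definition and the data file.
definition = {"targetTable": "Orders", "reportRecipient": "partner-a@example.com"}  # illustrative fields
with open("orders.csv", "rb") as data_file:
    requests.post(
        f"{BASE_URL}/etl/one-shot-executions",  # illustrative path only
        headers=HEADERS,
        files={
            "definition": ("definition.json", json.dumps(definition), "application/json"),
            "file": data_file,
        },
        timeout=60,
    ).raise_for_status()
```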

I need to push data into several tables and I want the imports to be synchronized in the right order

I need to import both "customers" and "orders" data in bulk and I want to make sure that all customers have been synchronized before I start to import their orders.

It is possible to define 'multifiles' ETLs, where several files are retrieved at once and processed in sequence. We advise using this method instead of setting up several unrelated imports.
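
For example, assuming the files are exchanged over an SFTP server, the multifile exchange on your side is simply dropping both files together; the processing order (customers before orders) is part of the multifile ETL definition configured in Actito, not of this code. File names, paths, and credentials are placeholders.

```python
import paramiko

# Placeholder connection details provided when your SFTP access is set up.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="my-licence", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)

# Both files are dropped together; the multifile ETL definition in Actito processes
# customers.csv before orders.csv, so every order finds its customer already synchronized.
for local_path, remote_path in [
    ("exports/customers.csv", "/upload/customers.csv"),
    ("exports/orders.csv", "/upload/orders.csv"),
]:
    sftp.put(local_path, remote_path)

sftp.close()
transport.close()
```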

I need to format my data when I import it

I want to capitalize the first letter of names imported in my profile table, even if the new subscribers input it in lowercase when they register.

Data transformations can be set up for all kinds of ETLs that let you import data in bulk: scheduled ETLs, triggered ETLs (whether the file is retrieved from an FTP server or pushed by API), or one shot ETLs.

It is not possible to apply data transformations to 1 by 1 API calls: the raw data should be processed in the code that triggers the call.
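
Since 1 by 1 calls cannot apply transformations, the "capitalize the first letter" example would be handled in your own code before the call, for instance with a small helper like this (the resulting payload then feeds into your 1 by 1 API call):

```python
def normalize_first_name(raw: str) -> str:
    """Capitalize the first letter of a name entered in lowercase (e.g. 'jane' -> 'Jane')."""
    cleaned = raw.strip()
    return cleaned[:1].upper() + cleaned[1:].lower() if cleaned else cleaned

# Apply the transformation before building the payload of your 1 by 1 API call.
payload = {"emailAddress": "jane.doe@example.com", "firstName": normalize_first_name("jane")}
print(payload)  # {'emailAddress': 'jane.doe@example.com', 'firstName': 'Jane'}
```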