- `list_objects` - Get a list of the objects your API can write to.
- `list_fields` - Get a list of the fields, along with data types and validation rules, for a given object.
- `supported_operations` - Find out what kinds of "writes" can be performed on an object - can new instances be created? Can existing instances be modified?
- `get_sync_speed` - How quickly should Census send data to your API?
- `sync_batch` - Sync one batch of records to your destination, reporting success and failure for each record.
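Since all five methods arrive at a single URL (the JSON-RPC-style envelope is described later in this document), one way to structure a connector is a small dispatcher. This is a sketch, not a full implementation: the stub handler and its return value are invented for illustration.

```python
import json

# Hypothetical stub; a real connector would query the destination SaaS here.
def list_objects(params):
    return {"objects": [{"object_api_name": "restaurant", "label": "Restaurants"}]}

HANDLERS = {
    "list_objects": list_objects,
    # "list_fields": ..., "supported_operations": ...,
    # "get_sync_speed": ..., "sync_batch": ...
}

def handle_request(raw_body: str) -> str:
    """Dispatch one request body to the matching method handler."""
    request = json.loads(raw_body)
    handler = HANDLERS[request["method"]]
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request["id"],  # echo the request id back to Census
        "result": handler(request.get("params", {})),
    })
```

Wiring this into an HTTP server or function-as-a-service handler is left out; only the method routing and response envelope are shown.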
`list_fields` methods). Census will use this information to populate its mapping API, and guide the users of your Custom API to map tables and columns in their data warehouse to your destination.
- `upsert` (most common) - records in the destination can be created or modified
- `insert` - records can only be created in the destination; they cannot be modified
- `update` - records can only be modified in the destination; they cannot be created
`update` records in the destination but not `insert` them, your Custom API must first check the destination to see if a matching record exists, and tell Census to skip the record if no match is found. Some destination systems may provide APIs like this for you ("create this record only if it does not exist") if they have strong enforcement of uniqueness on identifiers.
`upsert` sync will include a field that should be used as the identifier for matching source and destination records. This field must be unique and required in both systems (the source data warehouse and the destination SaaS), and it will be provided for every record. Your Custom API will tell Census (via the `list_fields` method) which fields can legally be used as identifiers.
`https://census-custom-api.yourcompany.example`, you could configure Census to invoke it as `https://census-custom-api.yourcompany.example?census_authentication=XoBcsUoFT82wgAcA` and verify all calls include the correct `census_authentication` value.
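A sketch of that verification, using the parameter name and illustrative secret from the URL above (the secret should really come from configuration, not source code):

```python
import hmac
from urllib.parse import urlparse, parse_qs

EXPECTED_SECRET = "XoBcsUoFT82wgAcA"  # illustrative value from the URL above

def is_authorized(request_url: str) -> bool:
    """Check that the census_authentication query parameter matches the secret."""
    params = parse_qs(urlparse(request_url).query)
    supplied = params.get("census_authentication", [""])[0]
    # compare_digest avoids leaking the secret through timing differences
    return hmac.compare_digest(supplied, EXPECTED_SECRET)
```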
- `method`: The name of the method Census is calling on your API. Methods are included in request bodies (instead of as a RESTful-style URL component) so that you can give Census a single URL as the integration point.
- `params`: A JSON object (possibly empty) containing parameters for the method
- `id`: A unique ID for the request. You must return this same ID in the response
- `jsonrpc`: This will always be the string `"2.0"`
- `result`: A JSON object that is the value you are returning from the method call. Can be empty, depending on the method
- `id`: The same `id` that was passed in to the request
- `jsonrpc`: This will always be the string `"2.0"`
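Concretely, a request and response pair for a `get_sync_speed` call might look like this sketch (the `id` and the result values are illustrative; the result keys are described later in this document):

```python
import json

# Request body Census sends to your single integration URL
request = {
    "jsonrpc": "2.0",
    "id": "d33ded2672b7877ff833c9c0f9a2bf5c",
    "method": "get_sync_speed",
    "params": {},
}

# Response body your Custom API returns: same id, same jsonrpc string
response = {
    "jsonrpc": "2.0",
    "id": request["id"],
    "result": {
        "maximum_batch_size": 100,
        "maximum_parallel_batches": 4,
        "maximum_records_per_second": 500,
    },
}
print(json.dumps(response))
```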
`sync_batch` method) within this time, you can use the `get_sync_speed` method to tell Census to send data more slowly until you are able to complete within this timeout.
`sync_batch` result in the data actually being persisted to the destination system. Once you tell Census a record has been synced, Census may never send that record again (if it doesn't change in the source), and Census has no way to know that it was "lost" from the destination system.
`sync_batch`, provide the ability to return application-level error messages. For `test_connection`, you can return a high-level error message helping the user debug why your Custom API may not be working.
`sync_batch` requires your Custom API to indicate a success or failure status for each record in the batch - error messages associated with record-level failures will be displayed in the Census Sync History UI.
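A helper for building that per-record status list could be sketched as follows. The `record_results` shape shown here is an assumption for illustration; check the method reference for the exact schema your connector must return.

```python
def sync_batch_result(outcomes):
    """Build a per-record status list: each entry reports success or a
    human-readable error message for display in the Sync History UI."""
    record_results = []
    for identifier, error in outcomes:
        if error is None:
            record_results.append({"identifier": identifier, "success": True})
        else:
            record_results.append({
                "identifier": identifier,
                "success": False,
                "error_message": error,  # surfaced to the user per record
            })
    return {"record_results": record_results}
```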
`can_create_fields` property is set to `on_write`. This is useful for objects which can have arbitrary properties, like events. Census allows you to map new fields in the Sync creation UI, but it's up to your API implementation to support ingesting and creating these new fields when they are received. Being able to create new fields is a prerequisite for an object supporting Sync All Properties.
`true`, or it cannot be the destination of a Census sync.
- `field_api_name` (string): A unique, unchanging identifier for this field
- `label` (string): A human-friendly name for the field
- `identifier` (boolean): If true, this field can act as a shared identifier in a Census sync. In order to be used as an identifier, a field must fulfill a few constraints:
- `required` (boolean): If true, a record cannot be created unless a value is specified for this field. Census will enforce this by requiring a mapping from a data warehouse column to this field before a sync can be performed.
- `createable` (boolean): If true, this field can be populated on record creation in the destination SaaS. This will be true for most fields. An example of a non-creatable field would be something like an auto-populated "Created At" timestamp that you're not able to write to using the SaaS API.
- `updateable` (boolean): Similar to `createable` - if true, this field can be populated when updating an existing record. Generally speaking, if a field is neither `createable` nor `updateable`, you might consider omitting it entirely from the `list_fields` response, as it won't be usable by Census for any syncs.
- `type` (string): The data type for this field. Census uses this to plan any "type casts" required to translate data from your data warehouse to your SaaS, to warn of invalid or lossy type casts, and will format the data on the wire to your custom connector using this type information. If `identifier` is `true`, the type must be `integer`. See the table below for the full list of types.
- `array` (boolean): If true, this field accepts an array of values instead of a single value. Any type can be used as an `array`; `array` types cannot be `identifiers`. Census will require array fields to have a matching array field in the mapped data warehouse column.
`numeric` data warehouse columns
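Taken together, a single field descriptor in a `list_fields` result might look like this sketch (the field is invented for illustration, and the surrounding result envelope is not shown):

```python
# One entry in a hypothetical list_fields result
email_field = {
    "field_api_name": "email",  # unique, unchanging identifier
    "label": "Email Address",   # human-friendly name
    "identifier": True,         # can match source and destination records
    "required": True,           # must be mapped before a sync can run
    "createable": True,         # can be set when creating a record
    "updateable": True,         # can be set when updating a record
    "type": "string",
    "array": False,             # identifiers cannot be arrays
}
```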
- `upsert`: Look for a matching record in the SaaS - if it is present, update its fields with the new values; if not, create it. This is the most commonly used operation.
- `insert`: If a matching record is not present in the SaaS, create it. If it is present, skip syncing this record.
- `update`: If a matching record is present in the SaaS, update it with new field values. If it is not present, do not create it and skip syncing it.
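The three operations above map naturally onto an object's capabilities. This sketch assumes a result shaped as a list of operation names; the exact envelope your connector must return is not shown here.

```python
def supported_operations_result(can_create: bool, can_modify: bool):
    """Translate an object's capabilities into the operation names above."""
    operations = []
    if can_create and can_modify:
        operations.append("upsert")
    if can_create:
        operations.append("insert")
    if can_modify:
        operations.append("update")
    return {"operations": operations}
```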
- `maximum_batch_size`: How many records Census should include in each call to `sync_batch` (see below). Census may send smaller batches than this, but it will never send larger batches. If your SaaS has a batch API, you should strongly consider setting this value to match your SaaS's maximum batch size. If your SaaS does not have a batch import API, we recommend setting this to a relatively low value (under 100), then testing the performance of your custom connector with different batch sizes and increasing the value if needed for increased throughput.
- `maximum_parallel_batches`: How many simultaneous invocations of `sync_batch` Census will perform. It's generally safe to set this to a large number and control your sync speed using the other two variables, but if your underlying infrastructure (web server or function-as-a-service provider) limits the number of parallel calls to your function, you should use this parameter to stay under that limit and avoid queueing at the network layer.
- `maximum_records_per_second`: How many records (not batches) Census will send to your custom connector per second, across all invocations. This should be matched to your SaaS API's maximum records per second, possibly with some buffer to allow for measurement error. The actual records per second may be less than this value, depending on `maximum_batch_size`, `maximum_parallel_batches`, and the average time it takes for your connector to complete one batch.
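These settings interact: actual throughput is bounded both by the records-per-second cap and by how quickly your connector drains batches. A rough back-of-the-envelope estimate (`seconds_per_batch` is something you would measure, not a setting):

```python
def effective_records_per_second(max_batch_size, max_parallel_batches,
                                 max_records_per_second, seconds_per_batch):
    """Estimate the throughput ceiling implied by the sync speed settings
    and the observed time to complete one sync_batch call."""
    # How fast batches can drain, independent of the per-second cap
    drain_rate = max_batch_size * max_parallel_batches / seconds_per_batch
    return min(max_records_per_second, drain_rate)
```

For example, with 100-record batches, 4 parallel invocations, and 2 seconds per batch, the connector drains 200 records per second, so a 500 records-per-second cap never binds.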