Event Syncs

A common type of user data is "events": records of each action a user takes. This guide shows how to define event data and sync it to event-based destinations.

What Are Events?

Unlike user profile records, event data is tied to specific actions and is almost always associated with a user and/or company. Events typically have a specific set of fields:

  • Unique ID - Often used by destination services to ensure events are not submitted multiple times

  • Timestamp - The time the event occurred

  • Event Name / Type - Indicates the type of event that happened at that time

  • User and/or Company ID - The entity that performed the event, or that the event happened to

  • Properties (optional) - Additional details about the event, often different for each event type

In your data model, each event should be represented as a single row/record in the data source. When Census detects a new row in the data source, it will pass it to the destination service as an event/action using the parameters you've configured.
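As an illustrative sketch, a single event row might look like the following in Python. The field names here are hypothetical, not a required Census schema; your model can use any column names.

```python
# A hypothetical event record: one per row in the data source.
event = {
    "event_id": "evt_0001",               # unique ID, used for deduplication
    "timestamp": "2024-05-01T12:30:00Z",  # when the event occurred
    "event_name": "order_completed",      # the type of event
    "user_id": "user_42",                 # who performed the event
    # Optional properties, often different per event type:
    "order_total": 99.50,
    "currency": "USD",
}

# The first four fields are the typical required set; the rest are properties.
required = {"event_id", "timestamp", "event_name", "user_id"}
assert required <= event.keys()
```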

Defining Event Syncs

Census supports an ever-growing set of destinations that accept event-style data. Often these are simply labeled "Events", but some destination services, such as Delighted and Webhooks, work like events in order to send NPS surveys and webhook API messages, respectively.

To send event data, you'll create a new sync that uses the Send sync behavior. Census will treat your event data as an ever-growing log and, whenever new rows appear in your data source, will send them over.

The Send event sync gives you a few special behaviors to optimize your sync:

Backfill or just look forward

When setting up a Send behavior sync, Census will ask whether you'd like to backfill the events already in the data source or only sync going forward. The latter option skips any existing data in the data source and syncs only new event rows as they appear.

Using timestamp to identify new records

This feature is meant to pick up fast-changing data, enabling many events to be synced as quickly as possible. If you're syncing to Slack, or working with low volumes of data, we do not recommend using this setting.

By default, Census uses the Event ID to detect which rows are new and should be synced. This is fine for most use cases, but when data sources grow beyond tens of millions of rows, it may be faster to use a timestamp if one is available.

This setting works as follows:

  1. On the first sync run, Census will identify the max date in the column you specified (ORDER_TIME, for example) and use it as a "high watermark"

  2. The first sync run will then send all records with a timestamp prior to the watermark if you choose to Backfill All Records, or skip all records prior to the watermark if you select Skip Current Records

  3. On each subsequent run, Census will identify the new "high watermark" (the new max date from your timestamp field) and sync the records whose timestamps are greater than or equal to the previous watermark but less than the new one
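The steps above can be sketched in Python as follows. This is a simplified model of the watermark behavior, not Census internals; `ORDER_TIME` stands in for whatever timestamp column you specify.

```python
def sync_run(rows, previous_watermark=None, backfill=True):
    """One simulated sync run over the event log.

    Returns (records_to_send, new_watermark). Timestamps only need to be
    comparable values (datetimes, ints, ISO strings, etc.).
    """
    if not rows:
        return [], previous_watermark
    # Step 1: the new high watermark is the max timestamp in the source.
    new_watermark = max(r["ORDER_TIME"] for r in rows)
    if previous_watermark is None:
        # Step 2, first run: "Backfill All Records" sends everything before
        # the watermark; "Skip Current Records" sends nothing.
        to_send = [r for r in rows if r["ORDER_TIME"] < new_watermark] if backfill else []
    else:
        # Step 3, subsequent runs: send rows in [previous_watermark, new_watermark).
        to_send = [r for r in rows
                   if previous_watermark <= r["ORDER_TIME"] < new_watermark]
    return to_send, new_watermark
```

Note that a record carrying exactly the current watermark timestamp isn't sent until the next run, when the watermark has moved past it, which is why the lower bound on later runs is inclusive.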

Identifying new records by timestamp is available on Google BigQuery, Snowflake, and Amazon Redshift data warehouses.

Tracking data history

Census will keep track of events that have been synced by ID (or timestamp) for the life of the sync configuration to avoid syncing them more than once. As a result, the typical Force Full Sync option is not available on event syncs. Event syncs also require data warehouses that support writing state; as such, read-only data sources are not supported at this time.

Mapping Event Properties

With your event sync ready to go, the next step is to let Census know how to handle the rest of the event properties. Here, you have three options depending on the shape of your data.

  1. Mapping all remaining columns to the destination automatically - Best for models that represent only that event type.

  2. Mapping each column to a destination property individually - Best for models with many columns where only a subset is needed for the actual event.

  3. (Where available) Using the properties bundle shortcut - Best for models that contain multiple different event types in the same data set.

Mapping all columns automatically

Nearly all event destinations support custom or dynamic properties, meaning the service will let Census send whichever properties you'd like, not just the default ones. In this case, you can let Census automatically sync all the remaining columns in your data model; as new columns are added, they will automatically be added to your sync and passed along as well.

This approach works best when your data model is 1:1 with the event data you want to send, but it is not a good fit if the model contains other columns that you don't wish to include in the event. In that case, you'll want to explicitly select which fields to sync.
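As a sketch of what "remaining columns" means here (column names are hypothetical): the columns you've mapped explicitly, such as the IDs, timestamp, and event name, are excluded, and everything else is passed along as an event property.

```python
# Hypothetical model columns for an events table.
all_columns = ["event_id", "timestamp", "event_name", "user_id",
               "plan", "order_total", "utm_source"]

# Columns already mapped explicitly in the sync configuration.
explicitly_mapped = {"event_id", "timestamp", "event_name", "user_id"}

# Every remaining column is sent automatically as an event property,
# including columns added to the model later.
auto_properties = [c for c in all_columns if c not in explicitly_mapped]
```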

Mapping specific columns

This is the standard method of defining syncs in Census. You'll add mapping entries for each column you'd like to send to the destination. Take advantage of the Create New Field option when setting up mappings as well.

Note: Most event services do not tell Census which properties already exist in the destination, so you'll need to provide the list of destination property names yourself if you want to reuse existing fields or events. Keep capitalization in mind!

Using the Properties Bundle

Some event destinations, such as Amplitude and Mixpanel, support a Properties Bundle shortcut, which accepts an object column (see Arrays and Nested Objects) containing the keys you'd like to send to the destination. The benefit of this approach is that your data model can include dynamic sets of properties for different event types in a single sync.
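For example, each row might carry a single object column whose keys differ by event type, all covered by one mapping. The rows and column name below are hypothetical; check the destination's documentation for the exact column type it accepts.

```python
import json

# Hypothetical rows: one "properties" object column, different keys per
# event type, sent through a single Properties Bundle mapping.
rows = [
    {"event_name": "order_completed",
     "properties": {"order_total": 99.5, "currency": "USD"}},
    {"event_name": "page_viewed",
     "properties": {"url": "/pricing", "referrer": "google"}},
]

# Each row's bundle is serialized as a JSON object alongside the event.
payloads = [json.dumps(r["properties"]) for r in rows]
```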
