AI Columns

AI Columns let you dynamically generate unique content for each row in your dataset using LLMs such as ChatGPT (OpenAI), Claude (Anthropic), and Gemini (Google). With AI Columns, you define a prompt and use Liquid templating to reference values from other columns. Census then sends a customized prompt request for each row and automatically writes the response back to your AI Column. AI Columns also materialize in your warehouse.
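
For example, a minimal per-row prompt might look like the sketch below. The record['...'] accessor and the column names are illustrative assumptions, not guaranteed syntax; see the Recipe Book linked later in this guide for exact sample prompts.

```
{% comment %} Illustrative sketch: the record accessor and column names are assumptions. {% endcomment %}
Write a short, friendly follow-up email for {{ record['FIRST_NAME'] }} at {{ record['COMPANY_NAME'] }},
who last logged in on {{ record['LAST_LOGIN_DATE'] }}. Keep it under 80 words.
```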

Try AI Columns for free using trial credits! No need for an API key until your trial credits run out.

Example Use Cases

  1. Automatically generate personalized email content or messages based on customer data.

  2. Generate insights or recommendations from transactional data, such as suggesting complementary products based on purchase history.

  3. Run sentiment analysis on emails received by the sales team from outbound campaigns to help with categorization and reporting (see the sample prompt after this list)

  4. Classify product usage of specific features as “high” or “low” to identify upsell candidates and run PLG playbooks

  5. Clean up data by removing special characters from a column
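
As a sketch of the sentiment-analysis use case above, a categorization prompt could look like the following. The EMAIL_BODY column reference is an assumption for illustration; adapt it to your own dataset.

```
{% comment %} Hypothetical column reference; adjust to your dataset. {% endcomment %}
Classify the sentiment of the reply below to our outbound campaign as exactly one of:
Positive, Neutral, Negative.

Email body: {{ record['EMAIL_BODY'] }}

Respond with the single category only, so the value can be used for reporting.
```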

Check out our Recipe Book for more examples and sample prompts.

Prerequisites

  • Your dataset should have a Unique ID column assigned

Note: You will need your API key to connect an LLM provider (OpenAI, Claude, Gemini) once you run out of Census credits.

  • To create a new OpenAI API key, log in to OpenAI, navigate to Dashboard / API keys, and generate a new Project API Key.

  • To create a new Anthropic API Key, navigate to Anthropic Console > Settings > API Keys and generate a new Key.

How to create an AI Column

If you are a video person, watch how to create a GPT column. Otherwise, follow the steps below.

Step 1: Log into your Census account.

Step 2: Navigate to the Datasets tab by clicking on Datasets in the left navigation panel.

Step 3: Choose the dataset where you want to add a new AI-based column. Make sure the dataset has a Unique ID column assigned.

Step 4: Select Enrich & Enhance in the top right corner, then choose AI and your preferred LLM provider.

Census Create AI Column

Step 5: Skip this step if you have trial credits. Otherwise, connect to the selected platform (OpenAI, Anthropic, Google) using your API key and click Next.

AI Columns Connect

Step 6: Create a prompt and fill out the column name.

Refer to our AI Prompts Recipe Book for some inspiration!

Census AI Column Prompt
  • Model Type - select a model from the provided list for the chosen LLM provider.

  • Expected output type - several optional properties help you guarantee data quality.

  • Prompt - the prompt runs against each row of your data and can leverage Liquid templating to reference column values (see the example below).
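
For example, a prompt for the feature-usage use case might look like this sketch. The column names are assumptions for illustration; pairing a tightly constrained prompt like this with the expected output type helps keep responses consistent across rows.

```
{% comment %} Column names are assumptions for illustration. {% endcomment %}
Based on the usage counts below, label this account's usage of advanced features
as exactly "high" or "low".

Reports created in the last 30 days: {{ record['REPORTS_CREATED_30D'] }}
API calls in the last 30 days: {{ record['API_CALLS_30D'] }}

Respond with only "high" or "low".
```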

Step 7: Hit the Create button and that's it. Census will generate an AI-based column in your dataset.

This step can take several minutes. Behind the scenes, Census sets up OpenAI/Anthropic/Google as a destination and runs a sync across all the rows in your selected dataset.

AI Columns refresh every 6 hours and only process new rows.

Warehouse Writeback

The results generated by AI Columns are stored directly in your source warehouse. Census creates a new table within the Census schema, prefixed with DATASET_COLUMN_, containing the AI Column.

This allows you to not only sync these AI-generated columns to your destination via Census but also explore them further within your warehouse.

Rate Limits

Requests made by Census to the LLM provider (e.g., OpenAI) are subject to daily rate limits, which may cause the underlying sync to stall. Rate limits can typically be increased by upgrading your organization's tier with the LLM provider.

For more information, please see the rate limit policies for your specific LLM provider.

Privacy and Security

Census only sends your prompt to the LLM provider. If your prompt references specific dataset columns via Liquid templates, the values of those columns are included as part of the prompt sent to the LLM provider. No other data is shared with the LLM.

Data sent via Census to the LLM provider is not used for training models. For more information, please refer to each LLM provider's data usage policies.

All requests made to the LLM provider are made through secure HTTPS channels, and only successful responses are saved to your dataset.
