Pardot Extract

    Article Summary

    This article is specific to the following platforms: Snowflake, Redshift, and BigQuery.

    Warning

    From version 1.53 of Matillion ETL, the original Pardot Extract component is deprecated. The connector remains available to users who already have this component included in their Orchestration Jobs.

    Additionally, from version 1.53 of Matillion ETL, a new version of the Pardot Extract component is available for use. This component uses an OAuth property for authentication. The User Key, Email Address, and Password properties have all been deprecated.
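
    The new component's OAuth property reflects how the Salesforce Pardot API itself now expects to be authenticated: a Salesforce OAuth access token plus the Pardot Business Unit ID sent as a request header. As a rough, hedged illustration of that flow outside Matillion ETL (the endpoint, credentials, and helper names below are assumptions for the sketch, not Matillion ETL settings):

        # Hedged sketch of OAuth-based Pardot API access outside Matillion ETL.
        # The connected-app credentials and refresh token are assumed to exist
        # already; verify endpoints and parameters against the Salesforce docs.
        import requests

        SF_TOKEN_URL = "https://login.salesforce.com/services/oauth2/token"

        def get_access_token(client_id, client_secret, refresh_token):
            """Exchange a refresh token for a Salesforce access token."""
            resp = requests.post(SF_TOKEN_URL, data={
                "grant_type": "refresh_token",
                "client_id": client_id,
                "client_secret": client_secret,
                "refresh_token": refresh_token,
            })
            resp.raise_for_status()
            return resp.json()["access_token"]

        def query_prospects(access_token, business_unit_id):
            """Call the Pardot API using the token and Business Unit ID header."""
            resp = requests.get(
                "https://pi.pardot.com/api/prospect/version/4/do/query",
                headers={
                    "Authorization": f"Bearer {access_token}",
                    "Pardot-Business-Unit-Id": business_unit_id,
                },
                params={"format": "json"},
            )
            resp.raise_for_status()
            return resp.json()

    In Matillion ETL itself, this exchange is handled by the OAuth entry and the Business Unit ID property; no code is required.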

    Pardot Extract

    The Pardot Extract component calls the Salesforce Pardot API to retrieve and store data to be either referenced by an external table or loaded into a table, depending on the user's cloud data warehouse. Users can then transform their data with the Matillion ETL library of transformation components.

    Using this component may return structured data that requires flattening. For help with flattening such data, we recommend using the Nested Data Load Component for Amazon Redshift and the Extract Nested Data Component for Snowflake or Google BigQuery.
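
    To illustrate what "flattening" means here, the sketch below uses plain Python and pandas with invented field names; it is not the Pardot schema, and inside Matillion ETL the components named above do this work instead.

        # Minimal flattening sketch with hypothetical field names; within
        # Matillion ETL, use the Nested Data Load / Extract Nested Data
        # components rather than hand-written code.
        import pandas as pd

        nested = [
            {"id": 1, "prospect": {"email": "a@example.com", "score": 42}},
            {"id": 2, "prospect": {"email": "b@example.com", "score": 17}},
        ]

        # json_normalize expands nested objects into dotted column names,
        # e.g. "prospect.email" and "prospect.score".
        flat = pd.json_normalize(nested)
        print(flat.columns.tolist())  # ['id', 'prospect.email', 'prospect.score']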

    Properties

    Snowflake Properties

    Property | Setting | Description
    Name | String | A human-readable name for the component.
    Data Source | Select | Select a data source. As noted above, once the Data Source property is configured, one or more properties specific to that data source become available to configure. These properties are not optional and must be configured. Refer to the "Data Source Properties" table in this documentation for guidance on these additional properties.
    Auth Method | Select | Select the authentication method: OAuth or User Key. OAuth requires an OAuth entry in the OAuth property; User Key requires completion of the User Key, Email Address, and Password properties.
    OAuth | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise a new OAuth entry, read the Pardot Authentication Guide.
    User Key | String | (Deprecated component only as of v1.53) The user key string that corresponds to the email address and password used to log in to Pardot. For help acquiring a Pardot user key, read the Pardot Authentication Guide.
    Email Address | String | (Deprecated component only as of v1.53) The email address for your Salesforce Pardot login.
    Password | String | (Deprecated component only as of v1.53) The password for your Salesforce Pardot login. Passwords can be stored inside the component; however, using the Password Manager feature instead is highly recommended.
    Business Unit ID | String | The Pardot Business Unit ID for your Pardot account. For more information, read Authentication.
    Page Limit | Integer | Set the page limit for the number of records to be returned and staged. Use -1 to attempt to take all available data, but be aware that this may take some time.
    Location | Storage Location | Provide an S3 bucket path, GCS bucket path, or Azure Blob Storage path that will be used to store the data. Once in the bucket or blob store, the data can be referenced by an external table. A folder is created at this location with the same name as the target table.
    Integration | Select | (GCP only) Choose your Google Cloud Storage integration. Integrations are required to permit Snowflake to read data from and write to a Google Cloud Storage bucket, and must be set up in advance of selecting them in Matillion ETL. To learn more, read the Storage Integration Setup Guide.
    Warehouse | Select | Choose a Snowflake warehouse that will run the load.
    Database | Select | Choose a database to create the new table in.
    Schema | Select | Select the table schema. The special value [Environment Default] uses the schema defined in the environment. For more information on using multiple schemas, refer to this article.
    Target Table | String | Provide a new table name. Warning: this table will be recreated, dropping any existing table of the same name.

    Redshift Properties

    Property | Setting | Description
    Name | String | A human-readable name for the component.
    Data Source | Select | Select a data source. As noted above, once the Data Source property is configured, one or more properties specific to that data source become available to configure. These properties are not optional and must be configured. Refer to the "Data Source Properties" table in this documentation for guidance on these additional properties.
    Auth Method | Select | Select the authentication method: OAuth or User Key. OAuth requires an OAuth entry in the OAuth property; User Key requires completion of the User Key, Email Address, and Password properties.
    OAuth | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise a new OAuth entry, read the Pardot Authentication Guide.
    User Key | String | (Deprecated component only as of v1.53) The user key string that corresponds to the email address and password used to log in to Pardot. For help acquiring a Pardot user key, read the Pardot Authentication Guide.
    Email Address | String | (Deprecated component only as of v1.53) The email address for your Salesforce Pardot login.
    Password | String | (Deprecated component only as of v1.53) The password for your Salesforce Pardot login. Passwords can be stored inside the component; however, using the Password Manager feature instead is highly recommended.
    Business Unit ID | String | The Pardot Business Unit ID for your Pardot account. For more information, read Authentication.
    Page Limit | Integer | Set the page limit for the number of records to be returned and staged. Use -1 to attempt to take all available data, but be aware that this may take some time.
    Location | Storage Location | Provide an S3 bucket path that will be used to store the data. Once in the S3 bucket, the data can be referenced by an external table. A folder is created at this location with the same name as the target table.
    Type | Dropdown | Select between a standard table and an external table.
    Standard Schema | Dropdown | Select the Redshift schema. The special value [Environment Default] uses the schema defined in the Matillion ETL environment.
    External Schema | Select | Select the table's external schema. To learn more about external schemas, consult the "Configuring The Matillion ETL Client" section of the Getting Started With Amazon Redshift Spectrum documentation.
    Target Table | String | Provide a name for the external table to be used. Warning: this table will be recreated, dropping any existing table of the same name.

    BigQuery Properties

    Property | Setting | Description
    Name | String | A human-readable name for the component.
    Data Source | Select | Select a data source. As noted above, once the Data Source property is configured, one or more properties specific to that data source become available to configure. These properties are not optional and must be configured. Refer to the "Data Source Properties" table in this documentation for guidance on these additional properties.
    Auth Method | Select | Select the authentication method: OAuth or User Key. OAuth requires an OAuth entry in the OAuth property; User Key requires completion of the User Key, Email Address, and Password properties.
    OAuth | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise a new OAuth entry, read the Pardot Authentication Guide.
    User Key | String | (Deprecated component only as of v1.53) The user key string that corresponds to the email address and password used to log in to Pardot. For help acquiring a Pardot user key, read the Pardot Authentication Guide.
    Email Address | String | (Deprecated component only as of v1.53) The email address for your Salesforce Pardot login.
    Password | String | (Deprecated component only as of v1.53) The password for your Salesforce Pardot login. Passwords can be stored inside the component; however, using the Password Manager feature instead is highly recommended.
    Business Unit ID | String | The Pardot Business Unit ID for your Pardot account. For more information, read Authentication.
    Page Limit | Integer | Set the page limit for the number of records to be returned and staged. Use -1 to attempt to take all available data, but be aware that this may take some time.
    Table Type | Select | Select whether the table is Native (the default in BigQuery) or External.
    Project | Select | Select the Google BigQuery project. The special value [Environment Default] uses the project defined in the environment. For more information, refer to the BigQuery documentation.
    Dataset | Select | Select the Google BigQuery dataset to load data into. The special value [Environment Default] uses the dataset defined in the environment. For more information, refer to the BigQuery documentation.
    Target Table | String | A name for the table. Warning: this table will be recreated, dropping any existing table of the same name. Only available when the table type is Native.
    New Target Table | String | A name for the new external table. Only available when the table type is External.
    Cloud Storage Staging Area | Cloud Storage Bucket | Specify the target Google Cloud Storage bucket to be used for staging the queried data. Users can either:
        1. Input the URL string of the Cloud Storage bucket following the template provided: gs://<bucket>/<path>
        2. Navigate through the file structure to select the target bucket.
        Only available when the table type is Native.
    Location | Cloud Storage Bucket | Specify the target Google Cloud Storage bucket to be used for staging the queried data. Users can either:
        1. Input the URL string of the Cloud Storage bucket following the template provided: gs://<bucket>/<path>
        2. Navigate through the file structure to select the target bucket.
        Only available when the table type is External.
    Load Options | Multiple Select | Clean Cloud Storage Files: Destroy staged files on Cloud Storage after loading data. Default is On.
        Cloud Storage File Prefix: Give staged file names a prefix of your choice. The default setting is an empty field.
        Recreate Target Table: Choose whether the component recreates its target table before the data load. If Off, the component will use an existing table or create one if it does not exist. Default is On.
        Use Grid Variable: Check this checkbox to use a grid variable. This box is unchecked by default.

    Data Source Properties

    The following table lists any Data Source that requires one or more unique component properties for configuration. If a Data Source is missing from this table, it does NOT have any unique component properties.

    Data Source | Property Name | Type | Description
    Email Stats | List Email ID | Integer | A single Email ID value.
    Emails | Email ID | Integer | A single Email ID value.
    List | Created before | Date | Include records created before a given GNU format date input.
    List | Created after | Date | Include records created after a given GNU format date input.
    List | Updated before | Date | Include records updated before a given GNU format date input.
    List | Updated after | Date | Include records updated after a given GNU format date input.
    List Membership | Created before | Date | Include records created before a given GNU format date input.
    List Membership | Created after | Date | Include records created after a given GNU format date input.
    List Membership | Updated before | Date | Include records updated before a given GNU format date input.
    List Membership | Updated after | Date | Include records updated after a given GNU format date input.
    Prospects | Created before | Date | Include records created before a given GNU format date input.
    Prospects | Created after | Date | Include records created after a given GNU format date input.
    Prospects | Updated before | Date | Include records updated before a given GNU format date input.
    Prospects | Updated after | Date | Include records updated after a given GNU format date input.
    Specific visitor details | ID | Integer | A single ID value or a comma-separated list of visitor ID values.
    Visitor Activity | Created before | Date | Include visitor activity records created before a given GNU format date input.
    Visitor Activity | Created after | Date | Include visitor activity records created after a given GNU format date input.
    Visitor Activity | Updated before | Date | Include visitor activity records updated before a given GNU format date input.
    Visitor Activity | Updated after | Date | Include visitor activity records updated after a given GNU format date input.
    Visitors | Only identified visitors | true/false | Only include identified visitors in this query.
    Visitors | Created before | Date | Include records created before a given GNU format date input.
    Visitors | Created after | Date | Include records created after a given GNU format date input.
    Visitors | Updated before | Date | Include records updated before a given GNU format date input.
    Visitors | Updated after | Date | Include records updated after a given GNU format date input.
    Visits | Created before | Date | Include records created before a given GNU format date input.
    Visits | Created after | Date | Include records created after a given GNU format date input.
    Visits | Updated before | Date | Include records updated before a given GNU format date input.
    Visits | Updated after | Date | Include records updated after a given GNU format date input.
    Visits | IDs | Integer | A single ID value or a comma-separated list of ID values.
    Visits | Visitor IDs | Integer | A single ID value or a comma-separated list of Visitor ID values.
    Visits | Prospect IDs | Integer | A single ID value or a comma-separated list of Prospect ID values.
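
    The "Created/Updated before/after" filters above take GNU-style date strings. The values below illustrate the kinds of input that syntax allows (absolute timestamps as well as relative expressions); they are assumptions for the sketch, and the parameter names shown are API-side equivalents rather than Matillion ETL property names, so confirm the exact accepted forms against the Pardot API documentation.

        # Illustrative GNU-style date inputs for the date filter properties.
        # These values are assumptions for the sketch; check the Pardot API
        # docs for the full set of accepted expressions.
        example_filters = {
            "created_after": "2023-01-01 00:00:00",  # absolute timestamp
            "created_before": "today",               # relative keyword
            "updated_after": "yesterday",            # relative keyword
            "updated_before": "7 days ago",          # relative expression
        }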

    How to Obtain Your User Key

    1. Log in to your Salesforce Pardot account.
    2. Via the left-hand menu, go to Admin → User Management → Users.
    3. Click on the name of the user you want the User Key for.
    4. Copy the entry beside "API User Key".
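
    For context only: the user key belongs to Pardot's legacy username-and-password login flow, which is what the deprecated User Key, Email Address, and Password properties map to. The hedged sketch below shows how that legacy flow worked outside Matillion ETL; Salesforce has since moved Pardot authentication to OAuth, so treat it purely as background.

        # Hedged sketch of the legacy Pardot login flow corresponding to the
        # deprecated User Key / Email Address / Password properties. Shown only
        # to clarify how the user key was used; new work should use OAuth.
        import requests

        def legacy_login(email, password, user_key):
            """Exchange Pardot credentials for an api_key (legacy API)."""
            resp = requests.post(
                "https://pi.pardot.com/api/login/version/4",
                data={"email": email, "password": password,
                      "user_key": user_key, "format": "json"},
            )
            resp.raise_for_status()
            return resp.json()["api_key"]

        # Subsequent legacy requests combined both keys in the request header:
        #   Authorization: Pardot api_key=<api_key>, user_key=<user_key>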