Configuration of Replicator for Salesforce


Scheduling Jobs

Perspectium Replicator uses Scheduled Jobs to process inbound and outbound replication data.

MultiOutput Processing Scheduled Job

The MultiOutput Processing Scheduled Job sends outbound replication data from Salesforce to the Perspectium MBS. Once scheduled, the job runs once every minute. To begin replicating shared data, you must configure and activate an instance of the MultiOutput Processing job by doing the following:

  1. Click on the Perspectium Jobs tab
  2. Click New Perspectium Job
  3. Select MultiOutput Processing from the Job Type list
  4. Check the Active checkbox
  5. Save

Replicator Subscriber Scheduled Job

The Replicator Subscriber Scheduled Job processes inbound replication data from the Perspectium MBS to Salesforce. Once scheduled, the job runs once every minute. To begin subscribing to shared data, you must configure and activate an instance of the Replicator Subscriber job by doing the following:

  1. Click on the Perspectium Jobs tab
  2. Click New Perspectium Job
  3. Select Replicator Subscriber from the Job Type list
  4. Check the Active checkbox
  5. Save

Queue Configuration

Shared Queues

Shared Queue configurations are used to transfer data out of Salesforce to a named queue in the Perspectium MBS. If a Shared Queue configuration is not specified for a Dynamic or Bulk Share, messages are sent to the default psp.in.salesforce queue in the Perspectium MBS and routing rules are applied. To create a new Shared Queue configuration:

  1. Click on the Queues tab
  2. Click the New Queue button
  3. Fill in the Queue Name (ex: psp.out.salesforce.test)
  4. Fill in an Alias name for this queue if desired (see below)
  5. Enter the Endpoint URL to the Perspectium MBS (ex: https://yourcompany.perspectium.net)
  6. Enter the Username for your Perspectium MBS account
  7. Enter the Password for your Perspectium MBS account
  8. Select the Direction of Share
  9. Save

v3.26.0

An “Alias” field has been added to the Queue configuration form for use with dynamic share Apex triggers. When you specify an alias for a queue, the dynamic share trigger will be created referencing this alias instead of the queue's ID, as was previously the case.

This allows you to more easily move Apex Triggers from sandbox/dev to production. Because Apex triggers cannot be easily modified in production (Salesforce requires you to create or modify the trigger in sandbox/dev first and then move it to production), you can use the Alias field to avoid modifying the trigger after testing in sandbox/dev and before moving it to production.

For example, you can create a test queue in your sandbox with the alias “db” and then create a different queue in production with the same alias “db”. Then, once you create the Apex trigger in sandbox and verify everything works as expected, you don't have to modify the trigger with the production queue's ID, since both queues share the same alias.

Aliases are only relevant for dynamic shares since bulk shares are run with background jobs that do not require Apex triggers.
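
The decoupling an alias provides can be sketched as a simple lookup. This is a hypothetical illustration only (the queue names and the `resolve_queue` helper are invented for this sketch; the actual resolution happens inside the Perspectium package): each org maps the same alias to its own queue, so the generated trigger text never has to change.

```python
# Hypothetical sketch: why an alias decouples the trigger from the queue ID.
# Each org maps the same alias to its own queue; a trigger referencing only
# the alias needs no edits when migrated from sandbox to production.

SANDBOX_QUEUES = {"db": "psp.out.salesforce.sandboxtest"}  # alias -> queue (sandbox)
PRODUCTION_QUEUES = {"db": "psp.out.salesforce.prod"}      # alias -> queue (production)

def resolve_queue(alias, org_queues):
    """Return the queue a trigger referencing `alias` would publish to."""
    return org_queues[alias]

# The same trigger code (referencing alias "db") resolves differently per org:
print(resolve_queue("db", SANDBOX_QUEUES))     # psp.out.salesforce.sandboxtest
print(resolve_queue("db", PRODUCTION_QUEUES))  # psp.out.salesforce.prod
```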

Subscribed Queues

Subscribed Queue configurations are used to transfer data from a named queue in the Perspectium MBS into Salesforce. To create a new Subscribed Queue configuration:

  1. Click on the Queues tab
  2. Click the New Queue button
  3. Fill in the Queue Name (ex: psp.out.salesforce.test)
  4. Enter the Endpoint URL to the Perspectium MBS (ex: https://yourcompany.perspectium.net)
  5. Enter the Username for your Perspectium MBS account
  6. Enter the Password for your Perspectium MBS account
  7. Select the Direction of Subscribe
  8. Save

Note: When sharing from ServiceNow to Salesforce, use AES128 encryption, since Salesforce does not currently support TripleDES encryption.

Bulk Share Configuration

Setting up a Bulk Share allows a Salesforce organization to share a pre-filtered range of table data at once. The consumer or subscriber of this data can be another organization in Salesforce or a Perspectium Agent. You will need to create a bulk share configuration for each table you want to share.

Bulk Shares always attempt to update an existing record on the consumer before inserting a new one. The update uses the Id field value when available. You can override this behavior by selecting the “Insert only” option described below; selecting “Insert only” assumes that no records currently exist in the destination table.
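
The update-then-insert behavior can be sketched as follows. This is a minimal illustration keyed on the Id field (the `apply_record` helper and the in-memory dict are invented for this sketch; the actual consumer logic lives in the subscribing agent or organization):

```python
def apply_record(destination, record, insert_only=False):
    """Apply a replicated record to a destination table (dict keyed by Id).

    Default behavior: update the existing row when the Id already exists,
    otherwise insert. With insert_only=True the record is always inserted,
    which assumes no matching records exist in the destination table.
    """
    rid = record["Id"]
    if not insert_only and rid in destination:
        destination[rid].update(record)   # update wins when the Id matches
        return "updated"
    destination[rid] = dict(record)       # fall back to (or force) insert
    return "inserted"

table = {"001A": {"Id": "001A", "Name": "Acme"}}
print(apply_record(table, {"Id": "001A", "Name": "Acme Corp"}))  # updated
print(apply_record(table, {"Id": "001B", "Name": "Globex"}))     # inserted
```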

Creating a Bulk Share

To create a Bulk Share, click on the Bulk Shares tab. Then click New Bulk Share and enter the following:

  1. Bulk Share Name (ex: All Accounts)
  2. Choose the Table you wish to share from (ex: Accounts)
  3. If using a named queue, choose the Shared Queue
  4. If you want to send a subset of the total table rows, enter the Where condition. Filter Fields can assist in building the where clause.
  5. Save
  6. (Optional) Click the Count button to preview the number of records that will be shared.
  7. Click Execute Now to begin Bulk Sharing data
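
For example, a Where condition on the Account table might restrict the share to recently modified customer rows. This is a hypothetical filter using standard Salesforce fields (LastModifiedDate, Type); use Filter Fields to build one against your own schema:

```
LastModifiedDate >= 2017-01-01T00:00:00Z AND Type = 'Customer'
```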

Cancelling a running Bulk Share

To cancel a running Bulk Share perform the following steps:

  1. Click the edit button for the Bulk Share you wish to cancel.
  2. Select Cancel from the Status dropdown.
  3. Click Save.

Bulk Share Main Page

On the Bulk Share main page, you can find the details, history, and statuses of Bulk Shares.

Dynamic Share Configuration

Setting up a Dynamic Share configuration allows a Salesforce organization to share table data in real time. The consumer or subscriber of this data can be another Salesforce organization or a Perspectium Agent. You will need to create a Dynamic Share configuration for each table you want to share.

Creating a Dynamic Share

To create a Dynamic Share configuration and begin near real-time replication of table data, click on the Dynamic Shares tab. Click New Dynamic Share and enter the following:

  1. Dynamic Share Name
  2. Select a Table to replicate data from (ex: Account)
  3. Check the Active checkbox if you want this dynamic share to be active (v3.25.0)
  4. Check the Create checkbox if you want to replicate inserts
  5. Check the Update checkbox if you want to replicate updates
  6. Check the Delete checkbox if you want to replicate deletes
  7. Check the Include Attachments checkbox if you want to include Attachments related to that record.
  8. If using a named queue, choose the Shared Queue
  9. If you wish to send a subset of the table fields, use the Fields to Share shuttle box.
  10. If you want to send a subset of the total table rows, enter the Where condition. Filter Fields can assist building the where clause.
  11. Click Save
  12. Click Create/Save Trigger

The Apex Trigger Details section contains a read-only version of the Apex Trigger code generated when the Dynamic Share configuration is saved. This can be used to manually create the Apex Trigger if the Tooling API is not available for dynamic Apex Trigger creation. Salesforce does not allow dynamic creation of Apex Triggers in production organizations, so Dynamic Share triggers should normally be migrated from sub-production organizations using a change control process. To perform this task manually, click the Go to Trigger List link and create or update the trigger using the content in the Apex Trigger Details section.

v3.24.0
Instead of deleting and recreating an Apex Trigger each time a dynamic share is modified, triggers are now updated in place, allowing you to more easily move changes to your production organizations. Note that attachment triggers are an exception: they will continue to be deleted and recreated when you select the “Include Attachments” option on a dynamic share.

v3.25.0
Because of how Salesforce moves change sets between development and production organizations (you can move modified objects, but removing items from production that were deleted in development requires a developer environment plugin such as Eclipse), an Active field has been added for cases where you want to turn off a dynamic share but may want to turn it back on later.

When Active is selected, the Apex Trigger is saved with its content as normal. When Active is unselected, the Apex Trigger is updated to be blank (no actions or operations) so that no records are shared out. This way you can turn off a dynamic share without having to delete it.

The “Include Attachments” Apex Trigger behaves the same way: when the option is selected, the attachment Apex Trigger is created with the necessary actions to share attachments; when unselected, the attachment Apex Trigger is updated to be blank, so it does not have to be deleted and can be turned back on later when the option is selected again.

Deleting a dynamic share will delete both the table's Apex Trigger and the attachment Apex Trigger.

Example: SQL Subscriber Agent Configuration for Salesforce

The SQL Subscriber agent can be configured to consume replicated Salesforce records and write them to a database table. The agent.xml file shown below is an example of the necessary configuration.

Note: The <schema_connection> password attribute is your Salesforce password with your security token appended to it.

<?xml version="1.0" encoding="ISO-8859-1" standalone="no"?>
<config>
    <agent>
      <max_reads_per_connect>4000</max_reads_per_connect>
      <polling_interval>30</polling_interval>
 
        <subscribe>
            <task>
                <task_name>salesForce</task_name>
                <message_connection queue="psp.out.replicator.mysalesforce" password="encrypted:iTOCp4deP7rJNgFkf2AEMA==" user="admin">amqp://perspectiumdemo1-amqp.perspectium.net</message_connection>
                <handler>com.perspectium.replicator.sql.SQLSubscriber</handler>
                <decryption_key>The cow jumped over the moon</decryption_key>
                <database_type>mysql</database_type>
                <database_server>localhost</database_server>
                <database_port>3306</database_port>
                <database_user>root</database_user>
                <database_password></database_password>
                <database_params>characterEncoding=UTF-8</database_params>
                <database>solutions</database>
 
                <schema_connection user="your_sfuser@yourcompany.com" password="salesforce1NxOwgnOlIH11JIfXC9Ob57Dc"
                    client_id="3MVG9uudbyLbNPZO0CYuZCvodrGD7QFWe6aPZY54i2cn4BqjhcuaSCYM2hgWI297gwSCr.5XEfy3WlyB4zXcI"
                    client_secret="8290250609080749993">https://login.salesforce.com/services/oauth2/token</schema_connection>
 
                <primary_key>Id</primary_key>
                <skip_database_creation/>
                <date_time_format>yyyy-MM-dd'T'HH:mm:ss.SSS'Z'</date_time_format>
            </task>
        </subscribe>
    </agent>
</config>
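
The date_time_format above matches Salesforce's ISO-8601 UTC timestamps with millisecond precision (e.g. 2017-08-06T22:53:00.000Z). As a sanity check, an equivalent pattern in Python would be:

```python
from datetime import datetime

# Python equivalent of the agent's yyyy-MM-dd'T'HH:mm:ss.SSS'Z' pattern.
# %f parses the fractional-seconds portion (Salesforce sends milliseconds).
PATTERN = "%Y-%m-%dT%H:%M:%S.%fZ"

ts = datetime.strptime("2017-08-06T22:53:00.000Z", PATTERN)
print(ts.year, ts.month, ts.day)  # 2017 8 6
```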

Salesforce Attachments into ServiceNow

v3.26.0
To read Salesforce attachments into ServiceNow, the u_sfdc_attachment_import import set table is provided as part of the Perspectium Salesforce Update Set for ServiceNow.

The update set comes with a subscribe configuration for u_sfdc_attachment_import so records can be read into the import set table. A script action will then run on records being inserted to properly add and delete attachments.

The Perspectium message shared out of Salesforce for an attachment includes a “ParentTable” field containing the name of the Salesforce table, which you can use to determine which ServiceNow table the record should map to.

The “Set parent table to attach to” business rule on the u_sfdc_attachment_import table is provided so you can set the corresponding import set field (u_parenttable) to the appropriate ServiceNow table, based on how you are subscribing Salesforce records into ServiceNow.

For example, if you are subscribing Salesforce Case records into ServiceNow Incident records, you can use the business rule to update the field as follows:

if (current.u_parenttable == 'Case')
    current.u_parenttable = 'Incident';

config_sf_replicator.txt · Last modified: 2017/08/06 22:53 by paul