Perspectium Replicator uses Scheduled Jobs to process inbound and outbound replication data.
The MultiOutput Processing Scheduled Job sends outbound replication data from Salesforce to the Perspectium Message Broker Service (MBS). Once scheduled, the job runs once every minute. To begin replicating shared data, configure and activate the MultiOutput Processing job by doing the following:
The Replicator Subscriber Scheduled Job processes inbound replication data from the Perspectium MBS into Salesforce. Once scheduled, the job runs once every minute. To begin subscribing to shared data, configure and activate the Replicator Subscriber job by doing the following:
Shared Queue configurations are used to transfer data out of Salesforce to a named queue in the Perspectium MBS. If no Shared Queue configuration is specified for a Dynamic or Bulk Share, messages are sent to the default psp.in.salesforce queue in the Perspectium MBS and routing rules are applied. To create a new Shared Queue configuration:
An “Alias” field has been added to the Queue configuration form for use with dynamic share Apex triggers. When you specify an alias for a queue, the dynamic share trigger is created referencing the alias rather than the queue's ID.
This makes it easier to move Apex triggers from sandbox/dev to production. Because Apex triggers cannot be easily modified in production (Salesforce requires you to create or modify the trigger in sandbox/dev first and then deploy it to production), the Alias field lets you avoid editing the trigger between testing in sandbox/dev and deploying to production.
For example, you can create a test queue in your sandbox with the alias “db” and a different queue in production with the same alias “db”. Once you have created the Apex trigger in the sandbox and verified that everything works as expected, you do not have to modify the trigger with the production queue's ID, since both queues share the same alias.
Aliases are only relevant to dynamic shares, since bulk shares run as background jobs that do not require Apex triggers.
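The alias mechanism can be illustrated with a small sketch (illustrative only, not Perspectium's actual implementation; the function and record names are assumptions). Looking a queue up by alias at run time means the trigger body never hardcodes an org-specific queue ID, so the same trigger works unchanged in sandbox and production:

```javascript
// Resolve a queue ID from its alias. queues is an array of
// {id, alias} records, one per Queue configuration in the org.
function resolveQueue(queues, alias) {
  var match = queues.filter(function (q) { return q.alias === alias; });
  return match.length > 0 ? match[0].id : null;
}

// Hypothetical sandbox and production orgs: different queue IDs, same alias.
var sandboxQueues = [{ id: 'a0x000000SBX', alias: 'db' }];
var prodQueues = [{ id: 'a0x000000PRD', alias: 'db' }];

// The same lookup key ('db') resolves correctly in both environments.
resolveQueue(sandboxQueues, 'db'); // 'a0x000000SBX'
resolveQueue(prodQueues, 'db');    // 'a0x000000PRD'
```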
Subscribe Queue configurations are used to transfer data from a named queue in the Perspectium MBS into Salesforce. To create a new Subscribe Queue configuration:
Note: When sharing from ServiceNow to Salesforce, please use AES128 encryption since Salesforce currently does not support TripleDES encryption.
Setting up a Bulk Share allows a Salesforce organization to share a pre-filtered range of table data at once. The consumer or subscriber of this data can be another organization in Salesforce or a Perspectium Agent. You will need to create a bulk share configuration for each table you want to share.
A Bulk Share always attempts to update the record on the consumer before inserting it. The update uses the Id field value when available. You can override this behavior by selecting the “Insert only” option described below; selecting “Insert only” assumes that no matching records currently exist in the destination table.
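The update-before-insert behavior can be sketched as follows (a minimal sketch with assumed names, not Perspectium's actual code; a plain object keyed by Id stands in for the destination table):

```javascript
// Apply an inbound Bulk Share record: try to update by Id first,
// fall back to insert. With insertOnly set, always insert.
function applyRecord(table, record, insertOnly) {
  var exists = record.Id && table.hasOwnProperty(record.Id);
  if (!insertOnly && exists) {
    table[record.Id] = record;   // update the existing row
    return 'updated';
  }
  table[record.Id] = record;     // insert a new row
  return 'inserted';
}
```

Note that with "Insert only" selected, a record whose Id already exists is written as an insert anyway, which is why the option assumes an empty destination table.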
To create a Bulk Share, click the Bulk Shares tab, then click New Bulk Share and enter the following:
To cancel a running Bulk Share, perform the following steps:
On the Bulk Share main page, you can view the details, history, and statuses of your Bulk Shares.
Setting up a Dynamic Share configuration allows a Salesforce organization to share table data in real time. The consumer or subscriber of this data can be another Salesforce organization or a Perspectium Agent. You will need to create a Dynamic Share configuration for each table you want to share.
To create a Dynamic Share configuration and begin near real-time replication of table data, click the Dynamic Shares tab, then click New Dynamic Share and enter the following:
The Apex Trigger Details section contains a read-only version of the Apex Trigger code generated when the Dynamic Share configuration is saved. If the Tooling API is not available for dynamic Apex Trigger creation, you can use this code to create the trigger manually. Salesforce does not allow dynamic creation of Apex Triggers in production organizations, so Dynamic Share triggers should normally be migrated from sub-production organizations using a change control process. To perform this task manually, click the Go to Trigger List link and create or update the trigger using the content in the Apex Trigger Details section.
Instead of deleting and recreating an Apex Trigger each time a dynamic share is modified, triggers are now updated in place, making it easier to move changes to your production organizations. Note that attachment triggers do not follow this behavior: they will continue to be deleted and recreated when you select the “Include Attachments” option on a dynamic share.
Because of how Salesforce moves change sets between development and production organizations (modified objects can be moved over, but removing items from production that were deleted in development requires a developer environment plugin such as Eclipse), an Active field has been added for cases where you want to turn off a dynamic share but may want to turn it back on later.
When Active is selected, the Apex Trigger is saved with its normal content. When Active is unselected, the Apex Trigger is updated to be blank (no actions or operations) so that no records are shared out. This lets you turn off a dynamic share without having to delete it.
The “Include Attachments” option works the same way: when it is selected, the attachment Apex Trigger is created with the actions necessary to share attachments; when it is unselected, the attachment Apex Trigger is updated to be blank, so it does not have to be deleted and can be reactivated later by selecting the option again.
Deleting a dynamic share deletes both the table's Apex Trigger and the attachment Apex Trigger.
The SQL Subscriber agent can be configured to consume replicated Salesforce records and write them to a database table. The agent.xml file shown below is an example of the necessary configuration.
Note: The <schema_connection> password attribute is your Salesforce password with your security token appended to it.
<?xml version="1.0" encoding="ISO-8859-1" standalone="no"?>
<config>
  <agent>
    <max_reads_per_connect>4000</max_reads_per_connect>
    <polling_interval>30</polling_interval>
    <subscribe>
      <task>
        <task_name>salesForce</task_name>
        <message_connection queue="psp.out.replicator.mysalesforce" password="encrypted:iTOCp4deP7rJNgFkf2AEMA==" user="admin">amqp://perspectiumdemo1-amqp.perspectium.net</message_connection>
        <handler>com.perspectium.replicator.sql.SQLSubscriber</handler>
        <decryption_key>The cow jumped over the moon</decryption_key>
        <database_type>mysql</database_type>
        <database_server>localhost</database_server>
        <database_port>3306</database_port>
        <database_user>root</database_user>
        <database_password></database_password>
        <database_params>characterEncoding=UTF-8</database_params>
        <database>solutions</database>
        <schema_connection user="email@example.com" password="salesforce1NxOwgnOlIH11JIfXC9Ob57Dc" client_id="3MVG9uudbyLbNPZO0CYuZCvodrGD7QFWe6aPZY54i2cn4BqjhcuaSCYM2hgWI297gwSCr.5XEfy3WlyB4zXcI" client_secret="8290250609080749993">https://login.salesforce.com/services/oauth2/token</schema_connection>
        <primary_key>Id</primary_key>
        <skip_database_creation/>
        <date_time_format>yyyy-MM-dd'T'HH:mm:ss.SSS'Z'</date_time_format>
      </task>
    </subscribe>
  </agent>
</config>
To read Salesforce attachments into ServiceNow, the u_sfdc_attachment_import import set table is provided as part of the Perspectium Salesforce Update Set for ServiceNow.
The update set comes with a subscribe configuration for u_sfdc_attachment_import so records can be read into the import set table. A script action will then run on records being inserted to properly add and delete attachments.
The Perspectium message shared out of Salesforce for an attachment includes a “ParentTable” field containing the name of the Salesforce table, which you can use to determine which ServiceNow table the attachment should map to.
The “Set parent table to attach to” business rule on the u_sfdc_attachment_import table is provided so you can set the corresponding import set field (u_parenttable) to the appropriate ServiceNow table, based on how you are subscribing Salesforce records into ServiceNow.
For example, if you are subscribing Salesforce Case records into ServiceNow Incident records, you can use the business rule to update the field as follows:
if (current.u_parenttable == 'Case') current.u_parenttable = 'Incident';
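The same mapping logic can be sketched as a standalone function, which may be easier to extend as you add more table mappings. The Case-to-Incident mapping comes from the example above; any additional entries you add to the mapping object are your own choices:

```javascript
// Map a Salesforce parent table name to its ServiceNow target table.
// Unmapped tables pass through unchanged.
function mapParentTable(sfTable) {
  var mapping = { 'Case': 'Incident' };
  return mapping.hasOwnProperty(sfTable) ? mapping[sfTable] : sfTable;
}
```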