Working with Salesforce Outbound Messages in .NET

What are Outbound Messages?

Outbound Messages in Salesforce are another way for your application to receive notifications of changes to your Salesforce data. Imagine a scenario where you would like your external application to be notified whenever a Lead record in Salesforce is created (or edited) and the City field on the Lead is set to Chicago. That is possible in Salesforce using an Outbound Message triggered from a Workflow Rule like the one just described, and I’ll discuss both of those Salesforce features in detail in this post.

We have already discussed the Streaming API here in the past, but there are a few key differences to observe with Outbound Messages:

  • There are no API limitations around the number of Outbound Messages which can be sent. The Streaming API has a set number of events which are allocated to you.
  • Outbound Messages require you to expose a public URL which receives the notifications, and there are some security considerations to be aware of. The Streaming API does not require a public facing URL and can run behind a firewall.
  • Workflow Rules allow a finer set of criteria based on field filters, or a formula field, which can give you more control over which notifications your application receives.
  • There is no ‘Replay’ behavior for Outbound Messages, and there is no guarantee that notifications will arrive in the order they occurred. You will receive an Id for each notification, but there is no way to replay past notifications. Similarly, if a record changes again before your message is sent, the message may contain only the latest data from the most recent change.

Important note: You should be cognizant of the risk of creating circular/infinite loops with Outbound Messages. If you create a Workflow Rule that sends a message every time an account is changed, and the receiver updates an opportunity, which fires a rule that updates the account again, you may find yourself in a difficult-to-troubleshoot scenario. To avoid this problem, create a Profile in Salesforce for the user which receives outbound notifications, and set the “Send Outbound Messages” field to off on that profile. This way the user can receive messages, but updates made by that user will not trigger additional cascading notifications.

Defining an Outbound Message

Creating our Outbound Message in Salesforce is a two-step process. The first step is to define the actual message; think of this as defining the data model for the process. The Workflow Rule, discussed later, is the controller that determines when our messages are sent. Head over to the Setup menu inside your sandbox/developer org and use the Quick Find box to locate “Process Automation > Workflow Actions > Outbound Messages” (1). From the list of current outbound messages, select “New Outbound Message”. On the first screen, indicate we are creating a message for the Lead record type.

On the next screen, specify a name (2) for our message that is representative of what is being delivered in each notification, and provide a placeholder Endpoint URL (3); we are going to change this later, so just set it to any valid URL for now. Enable the Session ID (4) so we get a parameter indicating who triggered the message, and specify the fields you’d like to receive in your payload (5) when the notification is delivered. As a suggestion, limit this to just the fields you need: a smaller payload reduces both the transmission time over the wire and the time needed to deserialize the messages.

Once the message has been saved, the final step is to capture the Endpoint WSDL. Click the link that says “Click for WSDL” and save the resulting contents to your workstation as “leads.wsdl”.

Creating the Workflow Rule

The next step in our two-step process is to create a Workflow Rule which triggers our Outbound Message to actually be delivered. In Setup, enter “Workflow Rule” in the Quick Find and select “Process Automation > Workflow Rules”. On the screen that lists your current workflow rules, select the “New Rule” button and designate that we are creating a rule that works on the Lead object.

On the next screen, provide an informative name for your rule. For the evaluation criteria you have three options: evaluate records when they are created and meet the criteria; when they are created or edited and meet the criteria; or only when a record changes from its original state to meet the criteria (i.e. the City was Phoenix, but it is now Chicago and the criteria is City = Chicago). We’ll select the second option (3), which runs every time our criteria are met, even if the matching values didn’t change.

For the rule criteria, we’re given a set of contextual parameters (based on the object type we’re working with) which determine whether our rule should execute. We could switch this to a formula instead, but that is outside the scope of this tutorial. For now, let’s stick with Lead: City equals Chicago (4). On the next screen, select “Add Workflow Action > Select Existing Action” just below the block for Immediate Workflow Actions. From there, indicate the action type is “Outbound Message” and pick the message we created earlier. Click “Save” and we are done with the Workflow Rule.

Create the Listener Service

We’ve reached the development portion of the tutorial. Because .NET Core does not support the server/hosting side of SOAP services, we’re going to use a .NET Framework 4.6.1 project to complete our integration. The first step is to ensure the wsdl.exe command is on your workstation’s path; it ships with the Windows SDK and is available from the Developer Command Prompt for Visual Studio. If the command “wsdl.exe -?” does not produce a valid response, you will need to track it down first.

Once we have everything ready, run the command below against the leads.wsdl we captured when setting up the outbound message:

wsdl.exe /serverInterface "c:\path\to\your\leads.wsdl" /out:NotificationServiceInterface.cs

This command will generate the server/hosting side contracts (/serverInterface) for the WSDL service since Salesforce is sending the message to us. Make a note of where your NotificationServiceInterface.cs file is located because we’ll be adding this to our project in a moment. The next step is to create a new “ASP.NET Web Application (.NET Framework)” to indicate we want the full version of .NET. You can create this with the “Empty” template type since we aren’t doing anything else here other than just prototyping something.

Once this is done, add a new folder called Infrastructure and add your previously generated NotificationServiceInterface.cs file to it. You may need/want to update the namespace in that file to ensure it matches your solution. The last step is to “Add New Item” in the Infrastructure folder we created, specify the type as “Web Service (ASMX)”, and name the file MyNotificationListener.asmx. The contents of that file should be as follows:

using System.Web.Services;

namespace Salesforce.OutboundMessageExample.Web.Infrastructure
{
    /// <summary>
    /// Summary description for MyNotificationListener
    /// </summary>
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [System.ComponentModel.ToolboxItem(false)]
    // To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line. 
    // [System.Web.Script.Services.ScriptService]
    public class MyNotificationListener : INotificationBinding
    {
        public notificationsResponse notifications(notifications n)
        {
            // A single request can carry up to 100 notifications. Returning
            // Ack = true tells Salesforce the batch was processed, so it
            // will not retry this delivery.
            notificationsResponse r = new notificationsResponse();
            r.Ack = true;
            return r;
        }
    }
}

Your solution should look like this when completed.

Important Security Considerations

There are some important security changes you should apply when running this application in production, and this is a good point to discuss some of them:

  • Ensure that your service is white-listed to only allow Salesforce IP addresses.
  • Use TLS/SSL endpoints for your services to ensure there are no parties which could intercept your sensitive Salesforce data.
  • An OrganizationId parameter is included in each message; validate it on every incoming message against a locally stored value for your organization (see the sketch after this list).
  • If your application allows (or requires) it, download the Salesforce Client Certificate to validate the message is indeed coming from Salesforce.
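Below is a minimal sketch of what that organization check could look like in the listener we created earlier (the WebService attributes from the previous listing are omitted for brevity). The OrganizationId field comes from the generated notifications contract; the SalesforceOrgId appSettings key is a hypothetical name for wherever you store your org id locally.

using System;
using System.Configuration; // requires a reference to System.Configuration

namespace Salesforce.OutboundMessageExample.Web.Infrastructure
{
    public class MyNotificationListener : INotificationBinding
    {
        public notificationsResponse notifications(notifications n)
        {
            notificationsResponse r = new notificationsResponse();

            // Compare the org id on the inbound message to the value we store
            // locally; both are assumed to be in the same 15/18-character form.
            string expectedOrgId = ConfigurationManager.AppSettings["SalesforceOrgId"];
            if (!string.Equals(n.OrganizationId, expectedOrgId, StringComparison.Ordinal))
            {
                // A false Ack tells Salesforce the delivery was not accepted.
                r.Ack = false;
                return r;
            }

            r.Ack = true;
            return r;
        }
    }
}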

Testing Locally with Ngrok

Because Salesforce needs to send a message to your application, and localhost is not a publicly accessible endpoint, we need to provide Salesforce with an address it can use to post messages to us. This is where Ngrok comes in. Ngrok allows you to create a public URL which tunnels to the local instance of your application. In the first step, when creating the Outbound Message, we provided a dummy URL; we’re going to update that now to a valid one.

Start debugging your application and make a note of the port. On my machine the address is: http://localhost:55475/Infrastructure/MyNotificationListener.asmx. Armed with this information, and a downloaded copy of ngrok in my system %path% variable, I will run the command below (replace 55475 with the port for your project):

ngrok http 55475 -host-header="localhost:55475"

You can then see we’re given an http/https URL to use (1) and also a web proxy we can use to monitor requests/responses (2). If I were to request https://6a920c28.ngrok.io/Infrastructure/MyNotificationListener.asmx the result should look the same as if I had requested the page on localhost:55475. Armed with this information, head back over to the Outbound Message you created in the first step, and update the Endpoint URL parameter with this new public URL you have just created.

Firing the Workflow Rule

Ok, we’re pretty much all set at this point. The last item is to head to the Workflow Rule we created in the second step and click “Activate” on the rule; if we don’t activate the Workflow Rule, no messages will be delivered to our endpoint. Ok! Are you ready? Now comes the exciting part. Head over to the “Leads” section of Salesforce and edit one of the leads which already exist there (or create a new one). Set any values you like; the only important thing is to ensure the “City” field is set to “Chicago” per our Workflow Rule. Once you save the record, your breakpoint should be hit and you can debug the incoming object to see all of the information you are being provided about the record. Boom!

Viewing Outbound Message Status

Having any trouble? Head over to the Setup location “Environments > Monitoring > Outbound Messages”. Here you will see a snapshot of messages queued for delivery as well as any recent failures, each with a helpful description. In my case, you can see a 404 Not Found was recorded because I didn’t have ngrok running.

Summary

Outbound Messaging is a great way to receive notifications of changes in your Salesforce org based on specific criteria, and you can be as specific or generic as you’d like. Salesforce will deliver up to 100 notifications at a time to your application, and it will retry failed deliveries for up to 24 hours, gradually widening the gap between retries to a maximum of two hours between attempts. I hope you enjoy this feature and are able to utilize it to keep your applications in sync with Salesforce.

 

Getting Started with the Salesforce Streaming API in .NET Core

Introduction

Have you ever wished you could get a notification whenever a record type or subset of records changed in Salesforce? You may want to be notified of those changes to replicate data, or to trigger behavior in an external business system.

Enter the Streaming API from Salesforce. This feature consists of several subfeatures which enable you to do just that. Opting in to certain changes is possible with Push Topics; getting a complete feed of every change is possible with Change Data Capture. We’re going to talk about both in this post, and about how you can integrate your .NET applications with Salesforce to capture those changes in near real-time.

If you want to see a full breakdown of the Streaming API components and capabilities, you can do so here. Also, please be sure your account meets the criteria for the Streaming API. If you have a developer sandbox from https://developer.salesforce.com, you should already be set.

Finally, I recommend you download the Postman collection I created for Salesforce here, and also the sample .NET Core application I created for connecting to the Streaming API.

CometD, Bayeux Protocol, and Long Polling

The Streaming API is possible thanks to a clustered web messaging technology known as CometD. This event messaging system is built on top of the Bayeux protocol, which transmits messages over HTTP. When you connect your .NET Core application to the Salesforce Streaming API, you are not repeatedly polling for new messages. You are actually establishing an open (long-polling) connection and waiting/listening for changes to be pushed out to you.

Below is a diagram that shows the various connections that are made when establishing a connection to the Streaming API. The client first performs a handshake to establish a long polling connection. If successful, it sends through a subscription to the channel you want to subscribe to (more on this below). A channel represents which type of event notifications you wish to receive. Once the channel subscription is open, the long-polling sequence takes over and you maintain this connection listening for changes.
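If you’re curious what those messages look like on the wire, below is a rough C# sketch of the handshake, subscribe, and connect calls against the /cometd/44.0 endpoint (the version segment matches your API version). The token and clientId placeholders are assumptions, and a real client would also parse the clientId out of the handshake response and honor the server’s reconnect advice; both are elided here.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class StreamingHandshakeSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // 1. Handshake: negotiate a Bayeux session and receive a clientId.
        Console.WriteLine(await Post(http,
            @"[{""channel"":""/meta/handshake"",""version"":""1.0"",""supportedConnectionTypes"":[""long-polling""]}]"));

        // 2. Subscribe: ask for events on a channel (more on channels below).
        Console.WriteLine(await Post(http,
            @"[{""channel"":""/meta/subscribe"",""clientId"":""<client-id>"",""subscription"":""/data/ChangeEvents""}]"));

        // 3. Connect: the long poll. Each request blocks until Salesforce
        //    pushes events (or the poll times out), then is re-issued.
        while (true)
        {
            Console.WriteLine(await Post(http,
                @"[{""channel"":""/meta/connect"",""clientId"":""<client-id>"",""connectionType"":""long-polling""}]"));
        }
    }

    static async Task<string> Post(HttpClient http, string json)
    {
        var body = new StringContent(json, Encoding.UTF8, "application/json");
        var response = await http.PostAsync("/cometd/44.0", body);
        return await response.Content.ReadAsStringAsync();
    }
}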

One advantage the Streaming API has over the Outbound Messages feature from Salesforce, is that you can run this from behind a firewall on your network without exposing an HTTP endpoint for Salesforce to push records to. You can read more about CometD, Bayeux Protocol, and Long Polling on the Salesforce documentation site.

Streaming API Limits

Just like with our previous posts on the REST API and on the Bulk API, Salesforce enforces limits around how many events can be captured per 24 hour period with the Streaming API. The limits are in place because Salesforce is a multi-tenant application, and resources are shared among customers. Constraints are considered beneficial because they promote good software design, and when you are developing your applications to connect to the Streaming API, you should be cognizant of the limits in place (which may vary from environment to environment, or customer to customer).

What is Change Data Capture?

Change Data Capture is currently in developer preview and is scheduled to be generally available in the Spring ’19 release (tentatively Feb. 8, 2019).

Below is an example payload you would receive when subscribed to a Change Data Capture channel and one of those entity types changes:


A few points in the payload JSON that gets sent to us are worth talking about briefly. The first is the entity type (1) which was changed. Change Data Capture allows you to subscribe to specific channels, such as /data/AccountChangeEvent for just Account changes, or, as in the payload above, all changes via /data/ChangeEvents (5). More on subscription channels for Change Data Capture is available here.

The next piece of information we get is the changeType (2) which outlines what type of operation happened. We’re also given just the data which changed (3) so we can limit the size of our payload and know what exactly needs to be updated.

Finally, we’re given a replayId (4) which serves as a sort of audit trail for this change. Salesforce supports the notion of durable messages, meaning these events are saved on the platform and can be ‘replayed’ for a period of time (1-3 days). When you establish your subscription in the connection diagram above, you can also provide a replay value: -1 (the default) means you want only new messages on this channel; -2 means you want everything available (be aware of your API limits and the processing time required); and a prior replayId X means you want everything after that event in time, assuming it still falls within the availability window.

I won’t be discussing replays in this post, partly because I’m still working on updating my sample code to support them, but if you want to learn more about them you can see a nice write-up here.

Setting up Change Data Capture Events

Inside your development org, head over to Setup > Integrations > Change Data Capture. Here you will see a screen similar to the screenshot below. It lists two columns: the available entities you can subscribe to, and the selected entities you have already subscribed to. Select a few examples, such as Lead, Account, and Contact. You can subscribe to changes on custom objects you have created as well.

For each standard entity you selected, you can retrieve events via a /data/<Type>ChangeEvent channel. Custom objects are available via a /data/<CustomObject>__ChangeEvent channel (note the double underscore). If you want to subscribe to all changes for all of your selected entities, the /data/ChangeEvents channel allows you to do that.

Subscribing to Change Data Capture Events

I’ve created a sample project which connects to the Salesforce Streaming API and published it on Github; you can use it as a starting point to begin receiving notifications. You’ll need to update the appsettings.json file with your relevant values. As an important aside, please use user secrets or environment variables for development configuration, and something like Azure Key Vault for production configuration, to ensure you are safely managing credentials for your orgs.

Once you’ve updated your settings, you can run the application. I’d encourage you to step through the code as well, which first establishes an OAuth token with your credentials, then performs a channel subscription to the channel you provided, which I’ve defaulted to /data/ChangeEvents to see all changes. If you have trouble connecting, be sure you can connect to the REST API using Postman and then try again.

As you can see in the screenshot above, we’ve managed to successfully connect to our org (1), perform a handshake to establish a long-polling connection (2), and designate our channel subscription (3, 4). Once the application began waiting for changes, we updated a Lead record and changed the company field to “American Banking Corp2” which you see outlined in our change data capture payload (5).

That about wraps it up for Change Data Capture. There is certainly a lot more to explore, but if you are looking to replicate every change to certain record types in your org for replication or other purposes, you can obtain that information using this feature of the Streaming API. There is also a great Trailhead module on Change Data Capture if you want to apply this badge to your Trailhead account and learn more.

What are PushTopics?

Push Topics are a way to receive more granular notifications based on a SOQL query you define. If you’re not familiar with SOQL, it is very similar to SQL but with a few minor differences around how records are joined. You can see a few examples here. Salesforce analyzes record changes against your SOQL query, and based on criteria outlined below, determines if the record should be pushed out to you.

Another parameter to consider when developing your Push Topic SOQL query is the NotifyForFields value. Your options are:

  • All – all field changes are pushed for records which match the WHERE clause of your query.
  • Referenced (default) – only changes to fields in your SELECT and WHERE clauses, for records which match the WHERE clause.
  • Select – only changes to fields in your SELECT statement, for records which match the WHERE clause.
  • Where – like Select, but monitors only the fields in your WHERE clause.

Setting Up PushTopics

Thankfully, it’s possible to create Push Topic records with the REST API; you can download the latest Postman collection which includes these calls. To create a push topic, we create a new sObject with a name (1), which we will later use to subscribe to our channel at /topic/Chicago_Leads. We also provide a SOQL query (2) which selects Name, FirstName, LastName, and Industry (Id is always required), with a where clause looking for Leads just in Chicago. Finally, we designate a NotifyForFields parameter of Select, meaning we’ll get push notifications when the FirstName or LastName changes (since they are in the Select fields), but only for leads in Chicago.
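If you’d rather create the topic from code than from Postman, here is a rough C# equivalent of that request; the instance URL, token, and API version are placeholders.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CreatePushTopicSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // PushTopic is a regular sObject, so we create it like any other record.
        var body = @"{
            ""Name"": ""Chicago_Leads"",
            ""Query"": ""SELECT Id, Name, FirstName, LastName, Industry FROM Lead WHERE City = 'Chicago'"",
            ""ApiVersion"": 44.0,
            ""NotifyForFields"": ""Select""
        }";

        var response = await http.PostAsync(
            "/services/data/v44.0/sobjects/PushTopic",
            new StringContent(body, Encoding.UTF8, "application/json"));

        // A successful create returns the new PushTopic record's Id.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}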

Once the push topic has been created, we get an Id back (4) which we can use to update this topic later if needed. One more point worth mentioning: we included the Name field in our query. Name is a compound field, and Salesforce has some special rules around notifications for compound fields like name and address when creating push topics.

Subscribing to PushTopics

To test it, update the appsettings.json file in our demo application to set the channel to “/topic/Chicago_Leads” and connect to the channel (1). Then, in the browser, edit a lead whose City is ‘Chicago’ and change its Industry; a streaming API event will be pushed to us (2).

Another way we can test our Push Topics is to use the Salesforce Workbench utility. After logging in, navigate to Queries > Streaming Push Topics. Here we can create a new test Push Topic (there is no option to specify the NotifyForFields parameter, so assume Referenced) and watch the stream of events/messages as they happen.

Salesforce has a nice Trailhead module on Push Topics and the Streaming API if you’d like to learn more about this subject.

Summary

To recap, the Streaming API is a great way to connect an externally hosted application or dataset to Salesforce and keep it up to date with changes happening in Salesforce. Whether you push just a certain predefined set of records with Push Topics, or take the fire hose of all changes with Change Data Capture, the Streaming API enables us as developers to keep systems in sync with each other.

 

Processing Large Amounts of Data in Salesforce with Bulk API 2.0

Running into limits with your Salesforce REST API usage, or simply looking ahead to processing a large amount of data in Salesforce? Because of the limits on how many REST API calls you can make per 24-hour period, an alternative approach is needed for large volumes of data. Thankfully, with the Bulk API, you can process up to 100 million records per 24-hour period, which should (hopefully) be more than enough for you.

If you need help authenticating with the Salesforce API and getting up and running, or using Postman to manage requests and collections, I’ve written about both of those previously. Also, if you would like a way to generate lots of sample data to test with, I’ve written about the Faker JS library, which could help with that too. (Note: Faker is available on other platforms also: .NET, Ruby, Python, and Java.)

As a final note, you can download the Postman collection I’ve been using for my Salesforce series on Github if you want to follow along that way.

Create the Job

The first step in creating a Bulk API request is to create a job. As part of the job request, you specify the object type, operation (insert, delete, upsert), and a few additional fields. In this example, we’re using the upsert operation paired with the External_Id__c field. If you don’t have this external id field on your Lead object, see my REST API post where I talk about working with external keys, which is very important when connecting external business systems. More information about the request body is available here.
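The job-creation request itself is small. Here is a sketch; the v44.0 path and the placeholders are assumptions consistent with the rest of this series.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CreateBulkJobSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Upsert Leads, matching rows on our custom External_Id__c field.
        var job = @"{
            ""object"": ""Lead"",
            ""operation"": ""upsert"",
            ""externalIdFieldName"": ""External_Id__c"",
            ""contentType"": ""CSV"",
            ""lineEnding"": ""LF""
        }";

        var response = await http.PostAsync(
            "/services/data/v44.0/jobs/ingest",
            new StringContent(job, Encoding.UTF8, "application/json"));

        // The response carries the job id, its Open state, and the contentUrl
        // we will upload our CSV data to.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}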

Once the response is returned, we can see we’re given a job id, as well as some additional information about the job, including the job’s state, which is currently Open; an Open job is ready to begin accepting data. We also have a Postman “Test” attached to this request which saves the returned {{job-id}} parameter so it can be used in future requests.

As a bonus, you can also view your bulk data jobs in the Salesforce web administration panel. Just head to Setup > Environments > Jobs > Bulk Data Load Jobs and you can see your currently open jobs as well as any jobs completed in the last 7 days.

Checking the Job Status

A common task when working with bulk jobs is checking the status of a job. We’ll do this now and see the job is still in the ‘Open’ state, ready to accept data. Later on, we may see a job in any of the following states: Open, UploadComplete, InProgress, JobComplete, Aborted, or Failed.

We’re also given the contentUrl parameter here, which is where we want to post our actual CSV data.
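Checking on the job is a simple GET against the job resource; here is a minimal sketch (the job id placeholder is hypothetical):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class CheckBulkJobSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // The JSON response includes "state" (Open, UploadComplete, InProgress,
        // JobComplete, Aborted, Failed) and the "contentUrl" for data uploads.
        Console.WriteLine(await http.GetStringAsync(
            "/services/data/v44.0/jobs/ingest/<job-id>"));
    }
}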

Uploading the Data

Once we are armed with an ‘Open’ job ready to accept data, we can start sending data to the job. We do this by sending a PUT request to the contentUrl from our job. In the screenshot below you can see we’ve set our Content-Type to text/csv and are sending the CSV data in the request body. The CSV columns match fields on your job’s object type (Lead in this case). If you were sending something like Contact records, you could also specify a column for Account.External_Id__c (assuming you had an external id set up on Account) to properly link each contact record to its corresponding Account object.
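Here is a sketch of that upload. The batches path matches the contentUrl shape returned when the job was created, and the CSV rows are toy data.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class UploadBulkDataSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Column headers map to fields on the job's object (Lead here).
        var csv = "FirstName,LastName,Company,City,External_Id__c\n" +
                  "Jane,Doe,Acme,Chicago,21456\n";

        var response = await http.PutAsync(
            "/services/data/v44.0/jobs/ingest/<job-id>/batches",
            new StringContent(csv, Encoding.UTF8, "text/csv"));

        Console.WriteLine(response.StatusCode); // expect Created (201)
    }
}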

There are also limits to be aware of when sending your CSV data to the job. The current limit is 150 MB of base64-encoded content. If your content is not already base64 encoded, consider limiting it to 100 MB, since Salesforce converts your request to base64 upon receipt, which can result in up to a 50% increase in size.

Marking the Job Complete

We briefly mentioned job state earlier when talking about job status. Once you have finished sending data to your job, you need to mark the job complete by setting the state to UploadComplete. If for some reason you didn’t want to finish processing the job and wanted to discard it, you could instead set the state to Aborted. Once the job is marked UploadComplete, Salesforce will shortly thereafter set the status to InProgress, and shortly after that, to JobComplete or Failed.
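A sketch of that state change; swapping UploadComplete for Aborted would discard the job instead.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CloseBulkJobSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Job state changes are PATCH requests against the job resource.
        var request = new HttpRequestMessage(new HttpMethod("PATCH"),
            "/services/data/v44.0/jobs/ingest/<job-id>")
        {
            Content = new StringContent(@"{""state"":""UploadComplete""}",
                Encoding.UTF8, "application/json")
        };

        var response = await http.SendAsync(request);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}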

Obtaining the Job Results

Salesforce makes two endpoints available to obtain the results of your bulk job’s successful and failed records. Once your job’s status is JobComplete, you can make a request to /jobs/ingest/{{job-id}}/failedResults to obtain the records that failed to process, including the reason those records were unsuccessful. You’ll always get back a sf__Id and sf__Error in the response so you can handle failures appropriately in your application. Similarly, you can send a request to /jobs/ingest/{{job-id}}/successfulResults to obtain the records which were processed successfully. Success records include a sf__Id and a sf__Created property indicating whether the record was created (or modified).
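Retrieving both result sets is a pair of GETs; a minimal sketch:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class BulkJobResultsSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Both endpoints return CSV: failed rows carry sf__Id and sf__Error,
        // successful rows carry sf__Id and sf__Created.
        Console.WriteLine(await http.GetStringAsync(
            "/services/data/v44.0/jobs/ingest/<job-id>/failedResults"));
        Console.WriteLine(await http.GetStringAsync(
            "/services/data/v44.0/jobs/ingest/<job-id>/successfulResults"));
    }
}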

Summary

In this post we discussed the Salesforce-recommended way to work with larger datasets. The Salesforce REST API is great for handling transactional records, or even working with up to 25 records at a time via the composite and batch REST endpoints, but for larger recordsets (up to 100 MB of CSV per upload), the preferred approach is the Bulk API. We’ve done a pretty good job (in my apparently not so humble opinion) covering the Bulk API features, but be sure to review the official docs if you want to learn more.

 

Understanding OAuth 2.0 Web Server Flow In Salesforce

Introduction

There are three common ways to authenticate with the Salesforce API: the Username/Password flow, the User-Agent flow, and the Web Server flow. There are subtle but important differences among them, so let’s briefly discuss what each of them does:

Username/Password Flow – Works on a secure server, and assumes the developer/application already has a set of API-enabled credentials they are working with.

User-Agent Flow – Works in client apps that reside on a user’s device or browser. This is preferred for JavaScript applications where secret storage cannot be guaranteed.

Web-Server Flow – Works on a secure server, but does not have credentials for the user available. Assumes the application can safely store the app secret.

In this post, we’re going to work through how you can test and develop against Salesforce locally on your machine using the Web-Server Flow. As an added bonus, we’ll look at how to make those URLs public with ngrok.

Salesforce Setup for Connected App

In your Salesforce developer account, navigate to Settings > Apps > App Manager. Click “New Connected App” in the top-right corner, and provide the following settings. It should look similar to the screenshot below when you are finished.

Connected App Name: OAuth2 Demo
API Name: OAuth2_Demo
Contact Email: <enter_your_email_address_here>
Enable OAuth Settings: checked
Callback URL: https://localhost:5001/salesforce/callback
Selected Oauth Scopes: Access your basic information (id, profile, email, address, phone)

Once you’ve finalized the setup of your connected app, be sure to make note of the ‘Consumer Key’ and ‘Consumer Secret’ which will be used in the sample project.

Sample Project Setup

I’ve posted a sample project on Github that you can download here to follow along. You’ll need to update your appsettings.json file with the client-id (Consumer Key) and client-secret (Consumer Secret) from your connected app you defined earlier, but that should be all that is necessary to run the demo application, even without knowledge of .NET.

After running the application, you’ll see the following in the browser:

The link that is generated here comes from SalesforceClient.cs. This client factory takes your appsettings.json settings and formulates them into a URL that the user is redirected to; embedded in it is the client-id for your application. There are a lot of additional options you can set within this link, such as state data you want passed through, display options, and more, outlined here.

After the user authenticates with Salesforce, they are prompted to allow your application access to the scopes you defined in the connected app; your app name is also presented to the user. If the user selects the ‘Allow’ button, they will be redirected back to the URL you specified in the ‘redirect-uri’ parameter. The URL will look something like: https://localhost:5001/salesforce/callback?code=aWekysIEeqM9PiThEfm0Cnr6MoLIfwWyRJcqOqHdF8f9INokharAS09ia7UNP6RiVScerfhc4w%3D%3D

The code parameter in the URL is the piece we are most interested in. This authorization code allows us to call the oauth2/token endpoint with the grant_type set to authorization_code. You can see an example of this in the SalesforceClient.cs file as well. If you’ve reached this point, congratulations, you now have an access token to use to make API requests. I’ve written about all the great things you can do with the Salesforce API here.
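A stripped-down version of that exchange might look like the sketch below; the oauth2/token endpoint is Salesforce’s standard one, and the placeholders come from your connected app and callback.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class TokenExchangeSketch
{
    static async Task Main()
    {
        var http = new HttpClient();

        // Exchange the authorization code from the callback URL for a token.
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "authorization_code",
            ["code"] = "<code-from-callback>",
            ["client_id"] = "<consumer-key>",
            ["client_secret"] = "<consumer-secret>",
            ["redirect_uri"] = "https://localhost:5001/salesforce/callback"
        });

        var response = await http.PostAsync(
            "https://login.salesforce.com/services/oauth2/token", form);

        // The JSON response contains access_token, instance_url, and more.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}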

Bonus: Make a public facing URL with Ngrok

Ngrok is a tunneling application that allows you to forward public facing urls to local urls on your workstation. After downloading it (and adding to your operating system’s path which I’d recommend), run the following command:

ngrok http 5000

This will give you a window similar to the one below. Note how we are using the insecure port, 5000. If you really want to forward to the SSL endpoint, there is documentation on how to achieve this at https://ngrok.com/docs#tls-cert-warnings, but I won’t be going over it; suffice it to say, you may run into a 502 Bad Gateway if you do.

Once you’ve done this, you can update your connected application configuration in Salesforce and replace the https://localhost:5001/salesforce/callback URL with the new ngrok URL you have here (e.g. https://4218e857.ngrok.io/salesforce/callback). You’ll also need to update this in your appsettings.json file for your application. This new URL will forward to port 5000 on your machine, which you set on the command line when you ran the ngrok executable.

The bonus of doing this is that you can share this URL with your customer or your project manager to give them a ‘preview’ of how the application will work, via a public facing URL. If you want to learn more about ngrok, Twilio has a nice write-up here.

 

Introduction to the Salesforce REST API (using Postman)

This post is going to be a rather lengthy introductory course on the Salesforce REST API. If you’re just looking for the Postman collection, or would like to just follow along, click here. We’ll discuss authentication, basic read operations, SOQL queries, batch & composite queries, and querying with an external key. We’ll also touch on the Salesforce workbench.

In future posts, I’ll discuss creating, updating, and deleting data with the REST API. I’m also planning posts on the Bulk API and Streaming API. If you aren’t already familiar with the Salesforce ecosystem, I’d encourage you to read this post first. Additionally, you’ll need Postman on your machine to get the most benefit from this post.

Salesforce Setup

Let’s get started. If you haven’t done so already, you’ll need to set up a developer account on https://developer.salesforce.com.

Once you’re logged in, from the gear icon in the top-right, navigate to Setup. From the ‘Quick Find’ box on the left, type in “App Manager” and select the menu item with the same name. On this screen, you’ll see a number of pre-built connection points. Let’s add our own by selecting “New Connected App” in the top-right.

Enter the following information and click “Save”:

Connected App Name: Postman Client
API Name: Postman_Client
Contact Email: <enter_your_email_address_here>
Enable OAuth Settings: checked
Callback URL: http://localhost/callback
Selected Oauth Scopes: Access and manage your data (api)

Once this is set up, you’ll be notified that it takes a few minutes for Salesforce to finalize your setup. Make a note of the “Consumer Key” and “Consumer Secret” listed on the app screen. The final piece of information you need is a security token: navigate to your profile settings (see below), select the Reset My Security Token item, and follow the instructions to obtain the token we’ll use for logging in.

A quick word about security tokens: they are not needed (and, for better security, it may be preferable not to use them) if you are on a network with a static IP and can white-list that address under the ‘Settings > Security > Network Access’ menu item. If you have that set up, you can skip the security token for the rest of the article. You could also use the token now and change your approach in production; I am just using this approach to allow everyone to follow along.

Postman Setup

If you haven’t done so already, please import the Postman collection by downloading it here. You can either download and extract the zip, or copy/paste the raw text. Pressing ctrl+O or navigating to File > Import in Postman will let you perform either to import the collection. When you’re done, it should look similar to this:

You’ll also need to set up an ‘Environment’ for Salesforce, which you can see in the top-right of the screenshot above. You’ll need to set the values for your client-id, client-secret, and also your username/password. I strongly encourage storing these values in the Environment instead of in the Postman request so they can be shared securely with others. Below is an example of what your Postman environment for Salesforce should look like when it’s done.

Be sure you’re also creating the fields for instance-url and access-token above even though they are empty. More on this below.

Authentication

Ok, so with the setup ceremony out of the way, let’s start having fun. Below is an example authentication request to Salesforce. Note this is a POST request, sent x-www-form-urlencoded, with a set of key/value pairs which come from our environment (selected in the top-right as Salesforce). Also observe how the {{password}} and {{security-token}} fields appear concatenated right next to each other.

Something else we’re going to review here is the “Tests” tab of this request. Tests are a fantastic feature in Postman, and they are worth learning more about. For the sake of this post, we are just going to use the Test feature to set our environment variables for {{access-token}} and {{instance-url}}. This is slick, because we have now taken our JSON response, parsed it, and stored pieces of it in our environment where they can be reused automatically in future requests.
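If you later move this request out of Postman and into code, the same call looks roughly like the sketch below; note the password and security token concatenated, exactly as in the Postman key/value pairs.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class PasswordFlowSketch
{
    static async Task Main()
    {
        var http = new HttpClient();

        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "password",
            ["client_id"] = "<consumer-key>",
            ["client_secret"] = "<consumer-secret>",
            ["username"] = "<username>",
            // password and security token are concatenated together
            ["password"] = "<password><security-token>"
        });

        var response = await http.PostAsync(
            "https://login.salesforce.com/services/oauth2/token", form);

        // The response includes access_token and instance_url, the two values
        // our Postman test stores in the environment.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}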

API Limits

I think it’s fitting that our first ‘real’ request to the Salesforce API is a request for the ‘limits’ on your account. API limits are an important concept in Salesforce, as this is a SaaS/multi-tenant environment and you are sharing resources with other users. Note below that we are using the {{access-token}} stored by our authentication test.

The limits you see here are for a 24-hour period. Our developer account is limited to 15,000 API requests per day, and we have 14,999 remaining, having used our first one to inquire about the limits themselves. A keen eye will also note that the “Postman Client” app we defined earlier has a limit range, but nothing is set. Apps can have their own API limit quotas, which a Salesforce admin may set for your app.

Limits encourage good software design and should be embraced. You’ll want to consider limits and supporting application design principles to ensure your application can work within them. One quick note: if you are looking to modify a large amount of data, take a look at the Bulk API, which I will discuss in a future post.

Basic Queries

The format for single-object queries in Salesforce is typically: /services/data/<version>/sobjects/<objectType>/<salesforceId>. An example of this is below. The last part of this query is a Salesforce Id: a 15-character, case-sensitive key which will never change, even if the record is deleted and later undeleted.

For the example below, the record identifier is likely different for your system, so be sure to head to the Sales app inside your Salesforce portal and filter the Accounts screen to view all, and select one from the list. Salesforce always puts the identifier for the current record in the URL, so you can grab one from your instance there. (ex: https://na85.lightning.force.com/lightning/r/Account/0011U000006ee9iQAA/view)

Something you may see in the future, is querying with a custom object or a custom property. Salesforce differentiates standard and custom objects with a “__c” postfix on object types and properties which are not standard. If I had created a custom object called “MyCustomObject” I would query this with “/MyCustomObject__c/” in the URL. We’ll see this again when we talk about querying with external keys later in the post.

Additionally, when querying an object, you can filter the fields which are returned. Notice how appending the fields querystring parameter narrows the resulting record down to just the pieces of information we’d like to retrieve. The smaller payload improves performance over the wire and also speeds up deserialization in our applications.
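As a sketch of that request outside Postman (the instance URL and token are placeholders, and the record Id will differ in your org):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class BasicQuerySketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Single-record lookup by Salesforce Id, trimmed to two fields.
        Console.WriteLine(await http.GetStringAsync(
            "/services/data/v44.0/sobjects/Account/0011U000006ee9iQAA?fields=Name,Phone"));
    }
}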

SOQL Queries

Once you’re comfortable with basic querying in Salesforce, a good next step is the SOQL query syntax. SOQL is often used inside the Salesforce ecosystem in the Apex programming language, but we can also leverage it here for REST calls. As you can see (1), the language is very SQL-like in nature, the one big difference (to me) being joins.

The result set we get back from the query has our results (3) and also a few interesting fields to note (2), the first being the totalSize and done parameters. If the value for done is false, you will also see a top-level property called nextRecordsUrl that will look like /services/data/v44.0/query/01g2100000PeqVoAAJ-2000/ (yours will be slightly different). You would then issue a GET against that URL to retrieve records 2001-4000; the URL identifies the query cursor on the server, so you do not re-send your SOQL query.
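Here is a sketch of that paging loop. Json.NET is assumed for parsing (any JSON library works), and the query is illustrative.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class SoqlPagingSketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        var url = "/services/data/v44.0/query?q=" +
                  Uri.EscapeDataString("SELECT Id, Name FROM Lead WHERE City = 'Chicago'");

        // Follow nextRecordsUrl until done is true.
        while (url != null)
        {
            var page = JObject.Parse(await http.GetStringAsync(url));
            foreach (var record in page["records"])
                Console.WriteLine(record["Id"] + " " + record["Name"]);

            url = (bool)page["done"] ? null : (string)page["nextRecordsUrl"];
        }
    }
}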

Querying with an External Key

Most often when you are using the Salesforce REST API, you are doing so to connect it to an external business system which has its own record identifiers/keys. Salesforce understands this notion and allows you to create properties on your objects which are of type “External Id”.

To set up one of these fields, navigate to Setup (1) > Object Manager (2) > select your relevant record type, Lead in this case (3) > Fields & Relationships (4), and then add a new field of type Text (5) for your key.


On the following screen specify a name for your field, and an API/Field name will be auto-generated for you (1). Note that anything which is a custom field or custom object type in Salesforce will always be referenced with “__c” at the end of it. So in a bit, we’ll see how this field “External_Id” actually maps to “External_Id__c” in the API. Finally, the checkbox at the end (2) indicates that this field is a unique identifier for a record in an external system.

Now that we have our External Id field set up, we can perform queries against it. You can also use it to update records, which I’ll talk about in the future, including how to reference parent/child records using the external id on each (i.e. Accounts + Contacts). See how, instead of using the Salesforce Id in the request, we’ve included the External_Id__c property in the URL and then referenced the record which has a value of 21456 in this field.
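In code, the external-key lookup is just a different URL shape; a minimal sketch:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ExternalIdQuerySketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Address the Lead by our own system's key instead of a Salesforce Id.
        Console.WriteLine(await http.GetStringAsync(
            "/services/data/v44.0/sobjects/Lead/External_Id__c/21456"));
    }
}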

Batch & Composite Queries

Salesforce offers two ways to execute multiple ‘requests’, which they refer to as subrequests, via a single API request in the form of batch queries & composite queries.

Let’s start with batch queries as shown in the screenshot below. In our payload (1), you’ll see we can provide multiple subrequests in an array. The batch query format supports up to 25 subrequests per API request, and each subrequest executes independently. Each subrequest also counts against our API limit noted earlier. The advantage here is we reduce our round-trip processing time and improve the speed of our applications.

In our response we can see whether any errors were returned (2) and the status code for each independent subrequest/response (3), along with the payload that would be returned. The batch API can also be used to perform PATCH, POST, and DELETE to modify your Salesforce data, which I’ll discuss in the future.

Composite queries are similar to batch queries, but different in a few key ways. First, the individual component subrequests do not count against your API limits. Second, the subrequests are executed in order.

Also, note how the response object from a prior query (1) can later be referenced in a subsequent query (2) in the chain. Each response from a composite query is wrapped in a body object (3). Similar to the batch API, you can also use this to create, update, and delete records instead of just querying it.
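Here is a hedged sketch of a composite request whose second subrequest references the first one’s response; the referenceId names are arbitrary.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CompositeQuerySketch
{
    static async Task Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("https://yourInstance.salesforce.com") };
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Subrequests run in order; @{refLead.Id} pulls the Id out of the
        // first subrequest's response body.
        var body = @"{
          ""compositeRequest"": [
            { ""method"": ""GET"", ""referenceId"": ""refLead"",
              ""url"": ""/services/data/v44.0/sobjects/Lead/External_Id__c/21456"" },
            { ""method"": ""GET"", ""referenceId"": ""refLeadAgain"",
              ""url"": ""/services/data/v44.0/sobjects/Lead/@{refLead.Id}?fields=City"" }
          ]
        }";

        var response = await http.PostAsync("/services/data/v44.0/composite",
            new StringContent(body, Encoding.UTF8, "application/json"));
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}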

Salesforce Workbench

There is a great utility application available at https://workbench.developerforce.com/. This portal allows you to perform the same operations we described above when prototyping your applications with Postman. Not only do you gain access to the REST API in a nice visual interface, but you can also test SOQL queries, subscribe to Push Topics (to be notified when data changes in Salesforce), view standard and custom object metadata, and more.

Below is an example of the “REST Explorer” in the workbench, available by going to Utilities > Rest Explorer after logging in. You can click through each of these endpoints and see everything Salesforce exposes via the API.

Final Thoughts

Salesforce provides developers with a well designed, properly versioned, full-featured API to use to connect your applications to their ecosystem. There are multiple ways to read as well as modify data inside the Salesforce system, designed so that you can work in a performant and responsible way while all users of the platform see similar performance in their requests. If you enjoyed this post, I’d love to hear how it helped you. You can find me on Twitter and Github. Also, I’d love it if you subscribe to my semi-regular newsletter of fun things I’ve found online.

 

Introduction to Salesforce for Developers

Many people have heard of Salesforce.  Right now, you’re probably thinking of a web portal used by sales people to find new customers, maintain accounts, and track opportunities, and you’d be right to think that.   Salesforce is a great platform for CRM, but it’s so much more than that.

There are plenty of articles out there on the web tracking the meteoric growth of Salesforce as a company.  One reason for this, is that Salesforce is becoming a full-blown platform, allowing developers (or even savvy business users) to create workflows, business processes, and more using an entire ecosystem of tooling. The company champions the philosophy of the fourth industrial revolution, which follows the digital revolution of computers, and is set to unify and connect people, devices, and data in unprecedented ways.

The core of Salesforce’s business is indeed CRM with the primary product being sold as Sales Cloud. You can sign-up for a free sandbox right now if you’d like at https://developer.salesforce.com. Sales Cloud will allow you to manage leads (new prospects, not yet customers), manage accounts (customers) and the contacts working there. There are features around creating business processes, validation rules, reporting, and more to assist in the sales process.

In addition to Sales Cloud, Salesforce offers a number of additional products:

  • Service Cloud – support and case management for your customers.
  • Marketing Cloud – outreach, email marketing, and journey building
  • Commerce Cloud – solutions for both B2C (formerly Demandware) and B2B (formerly CloudCraze) commerce
  • Heroku – cloud based application management, similar to Azure or AWS.
  • Mulesoft – unify and connect all your disparate business systems into a single solution.
  • Community Cloud – page & content builder for forums, portals, and websites.
  • Quip – workplace collaboration (documents, calendars, chat)
  • Trailhead – learn about Salesforce and earn points/badges.

Salesforce is a multi-tenant SaaS application, meaning multiple organizations will share the same instance. Think of this as renting office space in a downtown sky rise instead of building your own office in the suburbs. When you rent the space, the landlord (Salesforce) is taking care of everything for you in terms of water, electricity, security, mail delivery, and likely offers several on-site services such as a dry cleaning. In a similar fashion, Salesforce is providing you with hosting, application monitoring, security, API extensibility, and much more by building your application inside their environment. It’s less for you to worry about as a developer.

You aren’t limited to the core objects of each platform and can create your own data types to extend and enhance the system. You can also create your own siloed set of business functionality. Let’s use a contrived example: I want to manage my video game collection on Salesforce, which is about as far from “sales” as one could get. Inside the web portal, I can create my objects (Games, Publishers, Platforms), specifying the field types, validation rules, and any parent-child relationships. Once my data types are set up, I can enter records right away using the Salesforce UI. Additionally, my data is available via a REST or SOAP API automatically, and it’s also available in the Salesforce mobile app. I didn’t need to do any programming to set up the basic CRUD behaviors, security, logging, or anything else for these new objects; everything exists there for me. Sure, I can extend the application (more on that later), but a lot of the tedious boilerplate work is already taken care of for me.

If I want to write applications that work with this data I created, I can do so using a programming language known as Apex, which is syntactically very similar to C# or Java.  I can write database queries inside this programming language to query my data using SOQL (Salesforce Object Query Language) which is very similar in syntax to SQL.  If I want to insert/update/delete data in Salesforce I can use DML (Data Manipulation Language).  I can also search this data using SOSL (Salesforce Object Search Language) which has a similar syntax to Lucene for anyone who is familiar with the popular search framework that powers Elasticsearch/Solr.  Development is either done inside the ‘Developer Console‘ in Salesforce or can be done with an extension for VS Code.

As if the full development environment were not enough to convince you, there is also a full-featured learning portal, Trailhead, which gamifies the learning process, allowing you to earn badges and points for completing tutorials, and even lets you complete hands-on exercises which it checks for you. There are exercises for every type of user (business user, admin, or developer) with varying degrees of difficulty that allow you to get a quick start on the basics, but also to continue learning and grow into an advanced user. Once you’re comfortable developing on the platform, Salesforce even offers several certifications to show prospective clients that you are indeed qualified and capable.

There are a large number of courses available to developers on both Pluralsight and Udemy (and others, I’m sure) for anyone looking to learn more about Salesforce development. Salesforce has a large community of very passionate customers as well. If you were to look at any job board, you would see Salesforce skills are in high demand and growing rapidly along with the company itself. They also host a number of conferences each year, including Dreamforce in San Francisco.

I really hope this post gave you a good high-level overview of what Salesforce offers. In the future, I plan to dive into more specifics of the Salesforce platform as I continue to learn all it has to offer. A few key areas I’m really interested in learning more about are Einstein Analytics (Salesforce’s AI/BI platform), Mulesoft, and Commerce Cloud.