Category: Salesforce.com

  • My new Pluralsight course on the Salesforce.com integration APIs is now live!

    Salesforce continues to eat the world (3 billion transactions a day!) as teams want to quickly get data-driven applications up and running with minimal effort. However, for most apps to be truly useful, they need to be connected to other sources of data or business logic. Salesforce has a super broad set of integration APIs, ranging from traditional SOAP all the way to real-time streaming. How do you choose the right one? What are the best use cases for each? Last fall I did a presentation on this particular topic for a local Salesforce User Group in Utah. I’ve spent the past few months turning that one-hour presentation into a full-fledged Pluralsight training course. Yesterday, Pluralsight released the result.

    The course – Using Force.com Integration APIs to Connect Your Applications – is five delightful hours of information about all the core Force.com APIs.

    In addition to messing around with raw Force.com APIs, you also get to work with a custom set of Node.js apps that show a realistic scenario in action. This “Voter Trax” app calls the REST API and receives real-time Outbound Messages, while companion apps help you learn about the Streaming API and Apex callouts.

    I’ve put together seven modules for this course …

    1. Touring the Force.com Integration APIs. Here I explain the value of integrating Salesforce with other systems, help you set up your free developer account, discuss the core Force.com integration patterns, and give an overview of each integration API.
    2. Using the Force.com SOAP APIs to Integrate with Enterprise Apps. While the thought of using SOAP APIs may give you night terrors, the reality is that it’s still popular with plenty of enterprise developers. In this module, we take a look at authenticating users, making SOAP calls, handling faults, using SOAP headers, building custom SOAP services in Force.com, and how to monitor your services. All the demos in this module use Postman to call the SOAP APIs directly.
    3. Creating Lightweight Integrations with the Force.com REST API. Given the choice, many developers now prefer RESTful services over SOAP. Force.com has a comprehensive set of REST APIs that are secured via OAuth. This module shows you how to switch between XML and JSON payloads, how to call the API, how to make composite calls (which let you batch up requests), and how to build custom REST services. Here we use both Postman and a custom Node.js application to consume the REST endpoints.
    4. Interacting with Bulk Data in Force.com. Not everything requires real-time integration. The Force.com Bulk API is good for inserting lots of records or retrieving large data sets. Here we walk through the Bulk API and see how to create and manage jobs, work with XML or CSV payloads, deal with failures, and transform source data to fit the Salesforce schema.
    5. Using Force.com Outbound Messaging for Real-time Push Integrations. Outbound Messaging is one of the most interesting Force.com integration APIs, and it will remind you of the many modern apps that use webhooks to send out messages when events occur. Here we see how to create Outbound Messages, how to build compatible listeners, and options for tracking (failed) messages. This module has one of my favorite demos, where data changes from Salesforce are sent to a custom Node.js app and plotted on a Google map widget.
    6. Doing Push Integration Anywhere with the Force.com Streaming API. You may love real-time notifications, but can’t put your listeners on the public Internet, as required by Outbound Messaging. The Streaming API uses CometD to push (in reality, pull via long polling) messages to any subscriber to a channel. This module walks through authenticating users, creating PushTopics (the object that holds the query that Salesforce monitors; see the sketch after this list), creating client apps, and building general purpose streaming solutions with Force.com Generic Streaming.
    7. Consuming External Services with Force.com Apex Callouts. Salesforce isn’t just a system that others pull data from. Salesforce itself needs to be able to dip into other systems to retrieve data or execute business logic. In this module, we see how to call out to various external endpoints and consume the XML/JSON that comes back. We use Named Credentials to separate the credentials from the code itself, and even mess around with long-running callouts and asynchronous responses.
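
    To make PushTopics a bit more concrete, here’s a hedged sketch of defining one. It uses the Create call from the Force.com Toolkit for .NET (walked through further down this page); the topic name and query are made up, while the field names come from the standard PushTopic object.

    //assumes an authenticated ForceClient, as shown in the Toolkit walkthrough below
    dynamic pushTopic = new ExpandoObject();
    pushTopic.Name = "AccountChanges";                 //hypothetical channel name
    pushTopic.Query = "SELECT Id, Name FROM Account";  //the query Salesforce monitors
    pushTopic.ApiVersion = 36.0;
    pushTopic.NotifyForOperationCreate = true;         //publish events for new records
    pushTopic.NotifyForOperationUpdate = true;         //publish events for updated records

    //subscribers to the /topic/AccountChanges channel now receive matching changes
    string pushTopicId = await client.Create("PushTopic", pushTopic);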

    If your company uses Salesforce, you’ll be able to add a lot of value by understanding how to connect Salesforce to your other systems. If you take my new course, I’d love to hear from you!

  • Call your CRM Platform! Using an ASP.NET Web API to Link Twilio and Salesforce.com

    I love mashups. It’s fun to combine technologies in unexpected ways. So when Wade Wegner of Salesforce asked me to participate in a webinar about the new Salesforce Toolkit for .NET, I decided to think of something unique to demonstrate. So, I showed off how to link Twilio – which is an API-driven service for telephony and SMS – with Salesforce.com data. In this scenario, job applicants can call a phone number, enter their tracking ID, and hear the current status of their application. The rest of this blog post walks through what I built.

    The Salesforce.com application

    In my developer sandbox, I added a new custom object called Job Application that holds data about applicants, which job they applied to, and the status of the application (e.g. Submitted, In Review, Rejected).

    [Screenshot: the Job Application custom object]

    I then created a bunch of records for job applicants. Here’s an example of one applicant in my system.

    [Screenshot: a job applicant record]

    I want to expose a programmatic interface to retrieve “Application Status” that’s an aggregation of multiple objects. To make that happen, I created a custom Apex controller that exposes a REST endpoint. You can see below that I defined a custom class called ApplicationStatus and then a GetStatus operation that inflates and returns that custom object. The RESTful attributes (@RestResource, @HttpGet) make this a service accessible via REST query.

    @RestResource(urlMapping='/ApplicationStatus/*')
    global class CandidateRestService {
    
        global class ApplicationStatus {
    
            String ApplicationId {get; set; }
            String JobName {get; set; }
            String ApplicantName {get; set; }
            String Status {get; set; }
        }
    
        @HttpGet
        global static ApplicationStatus GetStatus(){
    
            //get the context of the request
            RestRequest req = RestContext.request;
            //extract the job application value from the URL
            String appId = req.requestURI.substring(req.requestURI.lastIndexOf('/')+1);
    
            //retrieve the job application
            seroter__Job_Application__c application = [SELECT Id, seroter__Application_Status__c, seroter__Applicant__r.Name, seroter__Job_Opening__r.seroter__Job_Title__c FROM seroter__Job_Application__c WHERE seroter__Application_ID__c = :appId];
    
            //create the application status object using relationship (__r) values
            ApplicationStatus status = new ApplicationStatus();
            status.ApplicationId = appId;
            status.Status = application.seroter__Application_Status__c;
            status.ApplicantName = application.seroter__Applicant__r.Name;
            status.JobName = application.seroter__Job_Opening__r.seroter__Job_Title__c;
    
            return status;
        }
    }
    

    With this in place – and after creating an “application” that gave me a consumer key and secret for remote access – I had everything I needed to consume Salesforce.com data.
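
    To give a feel for the endpoint, here’s a minimal sketch of calling it directly from C#. The instance URL, application ID, and access token below are placeholders; the token would come from the OAuth flow described later in this post.

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    class ApexRestDemo
    {
        static async Task Main()
        {
            //placeholder values; the token comes from an earlier OAuth request
            var instanceUrl = "https://na15.salesforce.com";
            var accessToken = "<access token>";

            using (var http = new HttpClient())
            {
                http.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", accessToken);

                //custom Apex REST services live under /services/apexrest/<namespace>/<mapping>
                var json = await http.GetStringAsync(
                    instanceUrl + "/services/apexrest/seroter/ApplicationStatus/123456");

                //the JSON body carries ApplicationId, JobName, ApplicantName, and Status
                Console.WriteLine(json);
            }
        }
    }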

    The ASP.NET Web API project

    How does Twilio know what to say when you call one of their phone numbers? They have a markup language called TwiML that includes the constructs for handling incoming calls. What I needed was a web service that Twilio could reach to retrieve instructions for what to say to the caller.

    I created an ASP.NET Web API project for this service. I added NuGet packages for DeveloperForce.Force (to get the Force.com Toolkit for .NET) and Twilio.Mvc, Twilio.TwiML, and Twilio. Before slinging the Web API Controller, I added a custom class that helps the Force Toolkit talk to custom REST APIs. This class, CustomServiceHttpClient, copies the base ServiceHttpClient class and changes a single line.

    public async Task<T> HttpGetAsync<T>(string urlSuffix)
    {
        //the single changed line: build the URL from the instance URL plus the custom REST path
        var url = string.Format("{0}/{1}", _instanceUrl, urlSuffix);

        var request = new HttpRequestMessage()
        {
            RequestUri = new Uri(url),
            Method = HttpMethod.Get
        };

        //the rest mirrors the stock ServiceHttpClient: send the request and
        //deserialize the JSON response (error handling trimmed for brevity)
        var responseMessage = await _httpClient.SendAsync(request);
        var response = await responseMessage.Content.ReadAsStringAsync();

        return JsonConvert.DeserializeObject<T>(response);
    }

    Why did I do this? The class that comes with the Toolkit builds up a particular URL that maps to the standard Salesforce.com REST API (paths under /services/data/vXX.X/). However, custom REST services use a different URL pattern (/services/apexrest/…). This custom class just takes in the base URL (returned by the authentication query) and appends a suffix that includes the path to my Apex controller operation.

    I slightly changed the WebApiConfig.cs to add a “type” to the route template. I’ll use this to create a pair of different URIs for Twilio to use. I want one operation that it calls to get initial instructions (/api/status/init) and another to get the actual status resource (/api/status).

    public static class WebApiConfig
        {
            public static void Register(HttpConfiguration config)
            {
                // Web API configuration and services
    
                // Web API routes
                config.MapHttpAttributeRoutes();
    
                config.Routes.MapHttpRoute(
                    name: "DefaultApi",
                    routeTemplate: "api/{controller}/{type}",
                    defaults: new { type = RouteParameter.Optional }
                );
            }
        }
    

    Now comes the new StatusController.cs that handles the REST input. The first operation takes in a VoiceRequest object from Twilio, and I build up a TwiML response. What’s cool is that Twilio can collect data from the caller. See the “Gather” operation, where I instruct Twilio to collect 6 digits from the caller and send them to another URI. In this case, it’s a version of this endpoint hosted in Windows Azure. Finally, I forced the Web API to return an XML document instead of sending back JSON (regardless of what comes in on the inbound Accept header).

    The second operation retrieves the Salesforce credentials from my configuration file, gets a token from Salesforce (via the Toolkit), issues the query to the custom REST endpoint, and takes the resulting job application detail and injects it into the TwiML response.

    public class StatusController : ApiController
        {
            // GET api/<controller>/init
            public HttpResponseMessage Get(string type, [FromUri]VoiceRequest req)
            {
                //build Twilio response using TwiML generator
                TwilioResponse resp = new TwilioResponse();
                resp.Say("Thanks for calling the status hotline.", new { voice = "woman" });
                //Gather 6 digits and send GET request to endpoint specified in the action
                resp.BeginGather(new { action = "http://twilioforcetoolkit.azurewebsites.net/api/status", method = "GET", numDigits = "6" })
                    .Say("Please enter the job application ID", new { voice = "woman" });
                resp.EndGather();
    
                //be sure to force XML in the response
                return Request.CreateResponse(HttpStatusCode.OK, resp.Element, "text/xml");
    
            }
    
            // GET api/<controller>
            public async Task<HttpResponseMessage> Get([FromUri]VoiceRequest req)
            {
                var from = req.From;
                //get the digits the user typed in
                var nums = req.Digits;
    
                //SFDC lookup
                //grab credentials from configuration file
                string consumerkey = ConfigurationManager.AppSettings["consumerkey"];
                string consumersecret = ConfigurationManager.AppSettings["consumersecret"];
                string username = ConfigurationManager.AppSettings["username"];
                string password = ConfigurationManager.AppSettings["password"];
    
                //create variables for our auth-returned values
                string url, token, version;
                //authenticate the user using Toolkit operations
                var auth = new AuthenticationClient();
    
                //authenticate
                await auth.UsernamePasswordAsync(consumerkey, consumersecret, username, password);
                url = auth.InstanceUrl;
                token = auth.AccessToken;
                version = auth.ApiVersion;
    
                //create custom client that takes custom REST path
                var client = new CustomServiceHttpClient(url, token, new HttpClient());
    
                //reference the numbers provided by the caller
                string jobId = nums;
    
                //send GET request to endpoint
                var status = await client.HttpGetAsync<dynamic>("services/apexrest/seroter/ApplicationStatus/" + jobId);
                //get status result
                JObject statusResult = JObject.Parse(System.Convert.ToString(status));
    
                //create Twilio response
                TwilioResponse resp = new TwilioResponse();
                //tell Twilio what to say to the caller
                resp.Say(string.Format("For job {0}, job status is {1}", statusResult["JobName"], statusResult["Status"]), new { voice = "woman" });
    
                //be sure to force XML in the response
                return Request.CreateResponse(HttpStatusCode.OK, resp.Element, "text/xml");
            }
         }
    

    My Web API service was now ready to go.

    Running the ASP.NET Web API in Windows Azure

    As you can imagine, Twilio can only talk to services exposed to the public internet. For simplicity’s sake, I jammed this into Windows Azure Web Sites from Visual Studio.

    [Screenshot: publishing the Web API to Windows Azure Web Sites from Visual Studio]

    Once this service was deployed, I hit the two URLs to make sure that it was returning TwiML that Twilio could use. The first request to /api/status/init returned:

    [Screenshot: TwiML response from /api/status/init]
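
    Working from the controller code above, that TwiML looks roughly like this (reconstructed, not captured from the live service):

    <Response>
      <Say voice="woman">Thanks for calling the status hotline.</Say>
      <Gather action="http://twilioforcetoolkit.azurewebsites.net/api/status" method="GET" numDigits="6">
        <Say voice="woman">Please enter the job application ID</Say>
      </Gather>
    </Response>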

    Cool! Let’s see what happens when I call the subsequent service endpoint and provide the application ID in the URL. Notice that the application ID provided returns the corresponding job status.

    [Screenshot: TwiML response containing the job status]
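
    Reconstructed the same way, with made-up job and status values, the response resembles:

    <Response>
      <Say voice="woman">For job Marketing Manager, job status is In Review</Say>
    </Response>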

    So far so good. Last step? Add Twilio to the mix.

    Setting Up the Twilio Phone Number

    First off, I bought a new Twilio number. They make it so damn easy to do!

    [Screenshot: buying a new Twilio phone number]

    With the number in place, I just had to tell Twilio what to do when the phone number is called. On the phone number’s settings page, I can set how Twilio should respond to Voice or Messaging input. In both cases, I point to a location that returns a static or dynamic TwiML doc. For this scenario, I pointed to the ASP.NET Web API service and chose the “GET” operation.

    [Screenshot: Twilio phone number voice request settings]

    So what happens when I call? Hear the audio below:

    Audio: https://seroter.com/wp-content/uploads/2014/07/twiliosalesforce.mp3 (“Calling Twilio”)

    One of the other great Twilio features is the analytics. After calling the number, I can instantly see usage trends …

    [Screenshot: Twilio usage analytics]

    … and a log of the call itself. Notice that I see the actual TwiML payload processed for the request. That’s pretty awesome for troubleshooting and auditing.

    [Screenshot: Twilio call log showing the processed TwiML payload]

    Summary

    In the cloud, it’s often about combining best-of-breed capabilities to deliver innovative solutions that no one technology has. It’s a lot easier to do this when working with such API-friendly systems as Salesforce and Twilio. I’m sure you can imagine all sorts of valuable cases where an SMS or voice call could retrieve (or create) data in a system. Imagine walking a sales rep through a call and collecting all the data from the customer visit and creating an Opportunity record! In this scenario, we saw how to query Salesforce.com (using the Force Toolkit for .NET) from a phone call and return a small bit of data. I hope you enjoyed the walkthrough, and keep an eye out for the recorded webcast where Wade and I explain a host of different scenarios for this Force Toolkit.

  • Co-Presenting a Webinar Next Week on Force.com and .NET

    Salesforce.com is a juggernaut in the software-as-a-service space and continues to sign up a diverse pool of global customers. While Salesforce relies on its own language (Apex) for coding extensions that run within the platform, developers can use any programming framework to integrate with Salesforce.com from external apps. That said, .NET is one of the largest communities in the Salesforce developer ecosystem and they have content specifically targeted at .NET devs.

    A few months back, a Toolkit for .NET was released and I’m participating in a fun webinar next week where we show off a wide range of use cases for it. The Toolkit makes it super easy to interact with the full Force.com platform without having to directly consume the RESTful interface. Wade Wegner – the creator of the Toolkit – will lead the session as we look at why this Toolkit was built, the delivery pipeline for the NuGet package, and a set of examples that show off how to use this in web apps, Windows Store apps, and Windows Phone apps.

    Sign up and see how to take full advantage of this Toolkit when building Salesforce.com integrations!

  • Using the New Salesforce Toolkit for .NET

    Wade Wegner, a friend of the blog and an evangelist for Salesforce.com, just built a new .NET Toolkit for Salesforce developers. This Toolkit is open source and available on GitHub. Basically, it makes it much simpler to securely interact with the Salesforce.com REST API from .NET code. It takes care of “just working” on multiple Windows platforms (Win 7/8, WinPhone), async processing, and wrapping up all the authentication and HTTP stuff needed to call Salesforce.com endpoints. In this post, I’ll do a basic walkthrough of adding the Toolkit to a project and working with a Salesforce.com resource.

    After creating a new .NET project (Console project, in my case) in Visual Studio, all you need to do is reference the NuGet packages that Wade created. Specifically, look for the DeveloperForce.Force package which pulls in the “common” package (that has baseline stuff) as well as the JSON.NET package.

    [Screenshot: adding the DeveloperForce.Force NuGet package]

    First up, add a handful of using statements to reference the libraries we need to use the Toolkit, grab configuration values, and work with dynamic objects.

    using System.Configuration;
    using System.Dynamic;
    using System.Threading.Tasks;
    using Salesforce.Common;
    using Salesforce.Force;
    

    The Toolkit is written using the async and await model for .NET, so calling this library requires some familiarity with that pattern. To keep this demo simple, define an operation like the one below that can be called from the Main entry point.

    static void Main(string[] args)
    {
        Do().Wait();
    }

    static async Task Do()
    {
        ...
    }
    

    Let’s fill out the “Do” operation that uses the Toolkit. First, we need to capture our Force.com credentials. The Toolkit supports a handful of viable authentication flows. Let’s use the “username-password” flow. This means we need OAuth/API credentials from Salesforce.com. In the Salesforce.com Setup screens, go to Create, then Apps and create a new application. For a full walkthrough of getting credentials for REST calls, see my article on the DeveloperForce site.

    [Screenshot: creating an application in the Salesforce.com Setup screens]

    With the consumer key and consumer secret in hand, we can now authenticate using the Toolkit. In the code below, I yanked the credentials from the app.config accompanying the application.

    //get credential values
    string consumerkey = ConfigurationManager.AppSettings["consumerkey"];
    string consumersecret = ConfigurationManager.AppSettings["consumersecret"];
    string username = ConfigurationManager.AppSettings["username"];
    string password = ConfigurationManager.AppSettings["password"];
    
    //create auth client to retrieve token
    var auth = new AuthenticationClient();
    
    //get back URL and token
    await auth.UsernamePassword(consumerkey, consumersecret, username, password);
    

    When you call this, you’ll see that the AuthenticationClient now has populated properties for the instance URL and access token. Pull those values out, as we’re going to use them when interacting with the REST API.

    var instanceUrl = auth.InstanceUrl;
    var accessToken = auth.AccessToken;
    var apiVersion = auth.ApiVersion;
    

    Now we’re ready to query Salesforce.com with the Toolkit. In this first instance, create a class that represents the object we’re querying.

    public class Contact
        {
            public string Id { get; set; }
            public string FirstName { get; set; }
            public string LastName { get; set; }
        }
    

    Let’s instantiate the ForceClient object and issue a query. Notice that we pass in a SQL-like query (SOQL) when querying the Salesforce.com system. Also, see that the Toolkit handles all the serialization for us!

    var client = new ForceClient(instanceUrl, accessToken, apiVersion);
    
    //Toolkit handles all serialization
    var contacts = await client.Query<Contact>("SELECT Id, LastName From Contact");
    
    //loop through returned contacts
    foreach (Contact c in contacts)
    {
          Console.WriteLine("Contact - " + c.LastName);
    }
    

    My Salesforce.com app has the following three contacts in the system …

    [Screenshot: three contacts in Salesforce.com]

    Calling the Toolkit using the code above results in this …

    [Screenshot: console output listing the contacts]

    Easy! But does the Toolkit support dynamic objects too? Let’s assume you’re super lazy and don’t want to create classes that represent the Salesforce.com objects. No problem! I can use late binding through the dynamic keyword and get back an object that has whatever fields I requested. See here that I added “FirstName” to the query and am not passing in a known class type.

    var client = new ForceClient(instanceUrl, accessToken, apiVersion);
    
    var contacts = await client.Query<dynamic>("SELECT Id, FirstName, LastName FROM Contact");
    
    foreach (dynamic c in contacts)
    {
          Console.WriteLine("Contact - " + c.FirstName + " " + c.LastName);
    }
    

    What happens when you run this? You should have all the queried values available as properties.

    [Screenshot: console output showing first and last names]

    The Toolkit supports more than just “query” scenarios; it works great for create/update/delete as well. Like before, these operations work with strongly typed objects or dynamic ones. First, add the code below to create a contact using our known “Contact” type.

    Contact c = new Contact() { FirstName = "Golden", LastName = "Tate" };
    
    string recordId = await client.Create("Contact", c);
    
    Console.WriteLine(recordId);
    

    That’s a really simple way to create Salesforce.com records. Want to see another way? You can use the dynamic ExpandoObject to build up an object on the fly and send it in here.

    dynamic c = new ExpandoObject();
    c.FirstName = "Marshawn";
    c.LastName = "Lynch";
    c.Title = "Chief Beast Mode";
    
    string recordId = await client.Create("Contact", c);
    
    Console.WriteLine(recordId);
    

    After running this, we can see this record in our Salesforce.com database.

    [Screenshot: the new contact record in Salesforce.com]

    Summary

    This is super useful and a fantastic way to easily interact with Salesforce.com from .NET code. Wade’s looking for feedback and contributions as he builds this out further. Add issues if you encounter bugs, and issue a pull request if you want to add features like error handling or support for other operations.

  • New Article on Creating and Consuming Custom Salesforce.com Web Services

    I’ve been asked to write a few more articles for the DeveloperForce site (the destination site for Salesforce.com developers) and the first one is now online. This article, entitled “Working with Custom SOAP and REST Services in .NET Applications”, takes a look at how to construct custom SOAP and REST services in Force.com, and then consume them from .NET applications.

    In this longer-than-expected article, I reviewed WHY you create custom services in a product that already has a robust SOAP/REST API, and show you how to build composite services, transaction-friendly services, and more. Consuming these custom services from .NET (or products like BizTalk Server) is easy and I tried to make it simple to follow along.

    Salesforce.com is growing like gangbusters, and the need for qualified integration architects is growing with it. Every time someone stands up a SaaS application, they should be thinking about how to integrate with other cloud or on-premises systems. I’ve been writing all these articles for them because (a) it’s fun, and (b) it’s important to understand all the integration options! Next up, I’ll be looking at mobile notification services (like Windows Azure Notification Hubs) and their Streaming API.

  • Where the heck do I host my … cloud database?

    So far, I’ve looked at options for hosting .NET and Node.js applications in the cloud. But what about the services that web applications rely on? It’s unlikely that your cloud application will use many on-premises services, so you’ll need things like databases nearby. There are a LOT of relational and NoSQL cloud databases out there. While it’s a perfectly reasonable choice to install and operate a database yourself on someone’s cloud VMs, this assessment looks at “managed” cloud databases. A managed cloud database typically takes care of underlying VM management as well as database tasks like backups.

    I’ve picked out 8 diverse choices (although MANY other interesting services exist), and evaluated them using the following criteria:

    • Type of offering (RDBMS, NoSQL)
    • Technology and versions supported
    • Scalability story
    • High availability options
    • Imposed constraints
    • Pricing plans
    • Administrative access
    • Support material offered

    There are other important factors to consider before actually selecting one of the services below. Make sure to look deeply at the feature set (and lack thereof), SLAs, and data privacy policies.

    Once again, I’m putting these in alphabetical order, which means that Amazon Web Services shows up first, and Windows Azure last. Just like that crafty Jeff Bezos wants.

    Amazon Web Services

    AWS has a variety of database services that offer excellent scale and innovative features.

    • Type of offering: Relational, NoSQL, and warehouse.
    • Tech and versions: RDS uses MySQL (5.6.13 and lower), SQL Server (2012, 2008 R2), and Oracle (11.2). DynamoDB is a proprietary NoSQL database. Redshift is a proprietary data warehouse platform.
    • Scalability: Manually scale RDS instances up and down with minimal downtime. DynamoDB scaling is done by increasing or decreasing the “provisioned throughput” without impacting availability. Redshift scaling occurs by adding or removing nodes in the cluster.
    • High availability: RDS instances scale up, but do support high availability through “Multi-AZ Deployments” for MySQL or Oracle. DynamoDB is built for high availability by default; its data is spread across AZs in a region and can withstand server or AZ failure. Redshift replicates data across nodes in a (single AZ) cluster and constantly backs up to S3.
    • Constraints: For RDS, MySQL or Oracle databases can be up to 3TB in size with 30k IOPS, while SQL Server databases can be 1TB in size with up to 10k IOPS. DynamoDB supports up to 10k read/write capacity units (unless you receive special permission); items can only be 64KB in size, but there is no size limit on an entire table. Redshift supports 16 XL nodes (2TB apiece) or 16 8XL nodes (16TB apiece) per cluster.
    • Pricing: RDS pricing includes an hourly charge for the instance, primary storage, Multi-AZ storage, backup storage, and data transfer out. DynamoDB pricing is pretty simple: pay for provisioned throughput units, storage, and data transfer out. For Redshift, you pay for capacity per hour, backup storage, and in some cases, data transfer.
    • Admin access: RDS users can create firewall policies that let them use standard client tools for connecting to DB instances. Few admin tasks exist for DynamoDB, but you can use the AWS Console and API. Access Redshift via the API and database/BI tools.
    • Support: For RDS, lots of documentation, some tutorials, support forums, and paid support. DynamoDB has documentation, forums, and paid support. Redshift is new, but you’ll find good documentation, forums, and paid support.

    Cloudant

    Cool provider of a distributed, cloud-scale JSON document database. Good when you need a high-performing, CouchDB-friendly environment.

    • Type of offering: NoSQL (document DB).
    • Tech and versions: Cloudant developed BigCouch, which is a fork of CouchDB.
    • Scalability: Scaled horizontally by Cloudant. Run as shared (AWS, Azure, Joyent, Rackspace, SoftLayer) or dedicated (AWS, Rackspace, SoftLayer).
    • High availability: Supports cross-data center, multiple writable masters.
    • Constraints: No apparent limits on DB size.
    • Pricing: For shared hosting, pay for data volume and HTTP requests.
    • Admin access: Compatible with the CouchDB API, so admins can use other CouchDB-friendly tools. Most of the admin activities are performed by Cloudant.
    • Support: Some documentation, and 24×7 support.

    Engine Yard

    Long-time PaaS provider offers a handful of different managed databases. One of the rare Riak hosters online so far, Engine Yard is a good bet for DB hosting if your app is running in their cloud.

    • Type of offering: Relational and NoSQL.
    • Tech and versions: Relational options include PostgreSQL (9.2.x) and MySQL (5.0.x). For NoSQL, Engine Yard offers hosted Riak and supports all possible Riak storage backends. Engine Yard databases run in AWS.
    • Scalability: Can scale PostgreSQL and MySQL servers up to larger server sizes. Riak is set up in a cluster, and it appears that clusters can be resized.
    • High availability: PostgreSQL and MySQL can be set up with read replicas and replication, but those appear to be the only HA options. The Riak cluster is set up in an AWS region and balanced between AZs.
    • Constraints: PostgreSQL and MySQL databases can be up to 1TB in size (EBS backed). The Riak service appears to support up to 1TB per node.
    • Pricing: Hourly pricing (based on server size), with no extra charge for the database software. Also pay for backups and bandwidth.
    • Admin access: Access databases from the outside using SSH tunnels and then your preferred management tool.
    • Support: Knowledge base, ticketing system, and paid support plans.

    Google

    Google offers a couple different databases for cloud developers. The options differ in maturity, but both offer viable repositories.

    • Type of offering: Relational and NoSQL.
    • Tech and versions: Google Cloud SQL is based on MySQL (5.5). The Google Cloud Datastore is a preview service that came from the Google App Engine High Replication Datastore (BigTable).
    • Scalability: For Cloud SQL, users can switch between instance sizes to adjust capacity. Cloud Datastore write throughput scales automatically.
    • High availability: Cloud SQL supports either sync or async replication to multiple geographic locations. The Cloud Datastore is replicated (in real time) across data centers.
    • Constraints: For Google Cloud SQL, the maximum request/response size is 16MB and databases can be up to 100GB in size. The Cloud Datastore has no maximum amount of stored data, up to 200 indexes, and no limit on reads/writes.
    • Pricing: Google Cloud SQL can be paid for in package (per-day) or per-use (hourly) billing plans; per-use plans include an additional per-hour charge for storage, and both plans require payment for outbound traffic. For the Cloud Datastore, you pay an hourly per-GB charge, plus a cost per 100k API operations.
    • Admin access: Use client tools that support a JDBC connection and the Google Cloud SQL driver; a command line tool is also supported. Developers use a tool from Google (gcd) to manage the Cloud Datastore.
    • Support: For Google Cloud SQL, you’ll find documentation, discussion forums, and paid support. Support for the Cloud Datastore can be found in communities, documentation, and a free/paid ticketing system.

    NuoDB

    Offers a “newSQL” product which is an object-oriented, peer-to-peer, transactional database. Powerful choice for on-premises or cloud data storage.

    • Type of offering: Relational.
    • Tech and versions: Proprietary, patented technology base.
    • Scalability: Supports manual scale out to more hosts, and can also apparently add capacity to existing hosts.
    • High availability: Journaling ensures that writes are committed to disk, and they offer multiple ways to configure the hosts in a highly available (geo-distributed, multi-master) way.
    • Constraints: The Amazon-hosted version has 1TB of storage, although seemingly you could add more. They also list a handful of SQL-related limits for the platform.
    • Pricing: NuoDB has three editions: the developer edition is free, the Pro version is “pay as you scale”, and the cloud version is based on usage in AWS. See here for a comparison of each.
    • Admin access: Offers a handful of CLI tools, visual consoles, and integration with 3rd party management tools.
    • Support: NuoDB offers documentation, GitHub samples, and support forums.

    Rackspace

    This leading cloud provider sells their own managed cloud database, and recently acquired another. Good choice for apps running in the Rackspace cloud, or if you need a well-engineered MongoDB environment.

    • Type of offering: Relational and NoSQL (document).
    • Tech and versions: Cloud Databases run MySQL (5.1). ObjectRocket is based on MongoDB.
    • Scalability: Cloud Databases can be scaled up, but not out. ObjectRocket scales out to more sharded instances, which can happen automatically or manually.
    • High availability: The Cloud Database relies on SAN-level replication of data, and not MySQL replication (unsupported). The ObjectRocket “pod” architecture makes it possible to replicate data easily; load balancers are in place, geo-redundancy is available, and backups are built in.
    • Constraints: It looks like most Cloud Database interactions are through the API, and rate limits are applied. You are also able to have up to 25 instances, at 150GB each. ObjectRocket offers unlimited data storage if you have defined shard keys; contact them if you need more than 200k operations/second.
    • Pricing: Cloud Databases are charged per hour, and storage is charged at $0.75 per month. ObjectRocket has four different plans where you pay monthly, per-shard.
    • Admin access: Some Cloud Database admin functions are exposed through their Control Panel (e.g. provision, resize) and others through the API (e.g. backup) or client tools (e.g. import). See more on how to access the DB instance itself.
    • Support: Rackspace provides lots of support options for Cloud Databases, including a ticketing system, community, help desk, and managed services. ObjectRocket support is done via email/chat/phone.

    Salesforce.com (Database.com)

    Recently made a standalone product after providing the backend to Salesforce.com for years, Database.com offers a feature-rich, metadata-driven database for cloud apps.

    • Type of offering: Relational.
    • Tech and versions: Oracle underneath, but no direct exposure of its capabilities; you interact solely with the Database.com interface.
    • Scalability: Pod architecture designed to scale up and out automatically based on demand.
    • High availability: Geographically distinct data centers and near real-time replication between them.
    • Constraints: No upper limit on storage, but API limits are imposed.
    • Pricing: Free for 3 users, 100k records, and 50k transactions. Pay for users, records, and transactions above that.
    • Admin access: Manage Database.com via the web console, Workbench, SOAP/REST API, and platform SDKs.
    • Support: Offers a dev center, discussion boards, support tickets, and paid support plans.

    Windows Azure

    Microsoft has a set of database options that are similar in scope to what AWS offers. Great fit for shared databases between partners or as a companion to a web app running in Windows Azure.

    • Type of offering: Relational and NoSQL.
    • Tech and versions: Windows Azure SQL Database runs SQL Server (2012). Windows Azure Table Storage provides a custom, schema-less repository.
    • Scalability: SQL Database servers can be scaled up, and usage can also be scaled out through Federations to shard data. Azure Table data is sharded according to a partition key and can support up to 20k transactions per second.
    • High availability: For SQL Databases, backups are taken regularly and at least 3 replicas exist for each database. Azure Tables are replicated three times within a given data center.
    • Constraints: SQL Databases can be up to 150GB in size and don’t support the full feature set of SQL Server 2012. Azure Table entities can be up to 1MB in size, and tables/accounts can store up to 200TB of data.
    • Pricing: Pay as you go for SQL Database instances, with different pricing for reserved capacity; also pay for bandwidth consumption. Azure Table pricing is rolled up into “Storage”, where you pay per GB/hr and for bandwidth.
    • Admin access: SQL Databases via the REST API, web Management Console, or client tools. Azure Tables can be accessed via the REST API (OData) and platform SDKs.
    • Support: Whitepapers, documentation, and community forums are all free. Also offer paid support plans.

    Summary

    Clearly, there are a ton of choices when considering where to run a database in the cloud. You could choose to run a database yourself on a virtual machine (as all IaaS vendors promote), or move to a managed service where you give up some control, but get back time from offloading management tasks. Most of these services have straightforward web APIs, but do note that migration between each of them isn’t a one-click experience.

    Are there other cloud databases that you like? Add them to the comments below!

  • Using the Windows Azure Service Bus REST API to Send to Topic from Salesforce.com

    In the past, I’ve written and talked about integrating the Windows Azure Service Bus with non-Microsoft platforms like Salesforce.com. I enjoy showing how easy it is to use the Service Bus Relay to connect on-premises services with Salesforce.com. On multiple occasions, I’ve been asked how to do this with Service Bus brokered messaging options (i.e. Topics and Queues) as well. It can be a little tricky as it requires the use of the Windows Azure REST API and there aren’t a ton of public examples of how to do it! So in this blog post, I’ll show you how to send a message to a Service Bus Topic from Salesforce.com. Note that this sequence resembles how you’d do this on ANY platform that can’t use a Windows Azure SDK.

    Creating the Topic and Subscription

    First, I needed a Topic and Subscription to work with. Recall that Topics differ from Queues in that a Topic can have multiple subscribers. Each subscription (which may filter on message properties) has its own listener and gets its own copy of the message. In this fictitious scenario, I wanted users to submit IT support tickets from a page within the Salesforce.com site.

    I could create a Topic in a few ways. First, there’s the Windows Azure portal. Below you can see that I have a Topic called “TicketTopic” and a Subscription called “AllTickets”.

    [Screenshot: TicketTopic and AllTickets in the Windows Azure portal]

    If you’re a Visual Studio developer, you can also use the handy Windows Azure extensions to the Server Explorer window. Notice below that this tool ALSO shows me the filtering rules attached to each Subscription.

    [Screenshot: Service Bus objects and Subscription rules in Server Explorer]
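
    You can also create these objects from code. Here’s a sketch using the NamespaceManager from the Windows Azure SDK for .NET (connection string assumed to be in config); the filtered Subscription at the end is hypothetical, keying off the custom “Account” property set later in this post.

    using System.Configuration;
    using Microsoft.ServiceBus;
    using Microsoft.ServiceBus.Messaging;

    class TopicSetup
    {
        static void Main()
        {
            //connection string assumed to live in app.config
            string connectionString =
                ConfigurationManager.AppSettings["Microsoft.ServiceBus.ConnectionString"];

            var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

            //create the Topic and a catch-all Subscription if they don't exist yet
            if (!namespaceManager.TopicExists("TicketTopic"))
                namespaceManager.CreateTopic("TicketTopic");

            if (!namespaceManager.SubscriptionExists("TicketTopic", "AllTickets"))
                namespaceManager.CreateSubscription("TicketTopic", "AllTickets");

            //hypothetical filtered Subscription that only sees messages whose
            //custom "Account" property matches a value
            namespaceManager.CreateSubscription(
                "TicketTopic", "AcmeTickets", new SqlFilter("Account = 'Acme'"));
        }
    }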

    With a Topic and Subscription set up, I was ready to create a custom VisualForce page to publish to it.

    Code to Get an ACS Token

    Before I could send a message to a Topic, I needed to get an authentication token from the Windows Azure Access Control Service (ACS). This token goes into the request header and lets Windows Azure determine if I’m allowed to publish to a particular Topic.

    In Salesforce.com, I built a custom VisualForce page with the markup necessary to submit a support ticket. The final page looks like this:

    [Screenshot: the VisualForce support ticket page]

    I also created a custom Controller that extended the native Accounts Controller and added an operation to respond to the “Submit Ticket” button event. The first bit of code is responsible for calling ACS and getting back a token that can be included in the subsequent request. Salesforce.com extensions are written in a language called Apex, but it should look familiar to any C# or Java developer.

       Http h = new Http();
           HttpRequest acReq = new HttpRequest();
           HttpRequest sbReq = new HttpRequest();
    
            // define endpoint and encode password
           String acUrl = 'https://seroter-sb.accesscontrol.windows.net/WRAPV0.9/';
           String encodedPW = EncodingUtil.urlEncode(sbUPassword, 'UTF-8');
    
           acReq.setEndpoint(acUrl);
           acReq.setMethod('POST');
           // choose the right credentials and scope
       acReq.setBody('wrap_name=demouser&wrap_password=' + encodedPW + '&wrap_scope=http://seroter.servicebus.windows.net/');
           acReq.setHeader('Content-Type','application/x-www-form-urlencoded');
    
           HttpResponse acRes = h.send(acReq);
           String acResult = acRes.getBody();
    
           // clean up result to get usable token
       String suffixRemoved = acResult.split('&')[0];
           String prefixRemoved = suffixRemoved.split('=')[1];
           String decodedToken = EncodingUtil.urlDecode(prefixRemoved, 'UTF-8');
           String finalToken = 'WRAP access_token=\"' + decodedToken + '\"';
    

    This code block makes an HTTP request to the ACS endpoint and manipulates the response into the token format I needed. The ACS response body comes back as form-encoded name/value pairs (wrap_access_token=…&wrap_access_token_expires_in=…), which is why the code splits the body apart and URL-decodes the token before wrapping it in the WRAP authorization scheme.

    Code to Send the Message to a Topic

    Now comes the fun stuff. Here’s how you actually send a valid message to a Topic through the REST API. Below is the complete code snippet, and I’ll explain it further in a moment.

      //set endpoint using this scheme: https://<namespace>.servicebus.windows.net/<topic name>/messages
       String sbUrl = 'https://seroter.servicebus.windows.net/tickettopic/messages';
           sbReq.setEndpoint(sbUrl);
           sbReq.setMethod('POST');
           // sending a string, and content type doesn't seem to matter here
           sbReq.setHeader('Content-Type', 'text/plain');
           // add the token to the header
           sbReq.setHeader('Authorization', finalToken);
           // set the Brokered Message properties
           sbReq.setHeader('BrokerProperties', '{ \"MessageId\": \"{'+ guid +'}\", \"Label\":\"supportticket\"}');
           // add a custom property that can be used for routing
           sbReq.setHeader('Account', myAcct.Name);
           // add the body; here doing it as a JSON payload
           sbReq.setBody('{ \"Account\": \"'+ myAcct.Name +'\", \"TicketType\": \"'+ TicketType +'\", \"TicketDate\": \"'+ SubmitDate +'\", \"Description\": \"'+ TicketText +'\" }');
    
           HttpResponse sbResult = h.send(sbReq);
    

    So what’s happening here? First, I set the endpoint URL. In this case, I had to follow a particular structure that includes “/messages” at the end. Next, I added the ACS token to the HTTP Authorization header.

    After that, I set the brokered messaging header. This fills up a JSON-formatted BrokerProperties structure that includes any values needed by the message consumer. Notice here that I included a GUID for the message ID and provided a “label” value that I could access later. Next, I defined a custom header called “Account”. These custom headers get added to the Brokered Message’s “Properties” collection and are used in Subscription filters. In this case, a subscriber could choose to only receive Topic messages related to a particular account.

    Finally, I set the body of the message. I could send any string value here, so I chose a lightweight JSON format that would be easy to convert to a typed object on the receiving end.

    With all that, I was ready to go.

    Receiving From Topic

    To get a message into the Topic, I submitted a support ticket from the VisualForce page.

    [Screenshot: submitting a support ticket from the VisualForce page]

    I immediately switched to the Windows Azure portal to see that a message was now queued up for the Subscription.

    [Screenshot: the queued message on the Subscription in the Windows Azure portal]

    How can I retrieve this message? I could use the REST API again, but let’s show how we can mix and match techniques. In this case, I used the Windows Azure SDK for .NET to retrieve and delete a message from the Topic. I also referenced the excellent JSON.NET library to deserialize the JSON object to a .NET object. The tricky part was figuring out the right way to access the message body of the Brokered Message. I wasn’t able to simply pull it out as a String value, so I went with a Stream instead. Here’s the complete code block:

               //pull Service Bus connection string from the config file
                string connectionString = ConfigurationManager.AppSettings["Microsoft.ServiceBus.ConnectionString"];
    
                //create a subscriptionclient for interacting with Topic
                SubscriptionClient client = SubscriptionClient.CreateFromConnectionString(connectionString, "tickettopic", "alltickets");
    
                //try and retrieve a message from the Subscription
                BrokeredMessage m = client.Receive();
    
                //if null, don't do anything interesting
                if (null == m)
                {
                    Console.WriteLine("empty");
                }
                else
                {
                    //retrieve and show the Label value of the BrokeredMessage
                    string label = m.Label;
                    Console.WriteLine("Label - " + label);
    
                    //retrieve and show the custom property of the BrokeredMessage
                    string acct = m.Properties["Account"].ToString();
                    Console.WriteLine("Account - " + acct);
    
                    Ticket t;
    
                    //yank the BrokeredMessage body as a Stream
                    using (Stream c = m.GetBody<Stream>())
                    {
                        using (StreamReader sr = new StreamReader(c))
                        {
                            //get a string representation of the stream content
                            string s = sr.ReadToEnd();
    
                            //convert JSON to a typed object (Ticket)
                            t = JsonConvert.DeserializeObject<Ticket>(s);
                            m.Complete();
                        }
                    }
    
                    //show the ticket description
                    Console.WriteLine("Ticket - " + t.Description);
                }
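
    One gap in the snippet above: the Ticket type isn’t defined in the post. A minimal shape matching the JSON payload built in the VisualForce controller would be:

    //minimal type matching the JSON body sent from Salesforce.com
    public class Ticket
    {
        public string Account { get; set; }
        public string TicketType { get; set; }
        public string TicketDate { get; set; }
        public string Description { get; set; }
    }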
    

    Pretty simple. Receive the message, extract interesting values (like the “Label” and custom properties), and convert the BrokeredMessage body to a typed object that I could work with. When I ran this bit of code, I saw the values we set in Salesforce.com.

    [Screenshot: console output showing the Label, Account, and ticket values]

    Summary

    The Windows Azure Service Bus brokered messaging services provide a great way to connect distributed systems. The store-and-forward capabilities are key when linking systems that span clouds or link the cloud to an on-premises system. While Microsoft provides a whole host of platform-specific SDKs for interacting with the Service Bus, there are platforms that have to use the REST API instead. Hopefully this post gave you some insight into how to use this API to successfully publish to Service Bus Topics from virtually ANY software platform.

  • TechEd North America Session Recap, Recording Link

    Last week I had the pleasure of visiting New Orleans to present at TechEd North America. My session, Patterns of Cloud Integration, was recorded and is now available on Channel9 for everyone to view.

    I made the bold (or “reckless”, depending on your perspective) decision to show off as many technology demos as possible so that attendees could get a broad view of the options available for integrating applications, data, identity, and networks. Since TechEd is a Microsoft conference, many of my demonstrations highlighted aspects of the Microsoft product portfolio – including one of the first public demos of Windows Azure BizTalk Services – but I also snuck in a few other technologies. My demos included:

    1. [Application Integration] BizTalk Server 2013 calls REST-based Salesforce.com endpoint and authenticates with custom WCF behavior. Secondary demo also showed using SignalR to incrementally return the results of multiple calls to Salesforce.com.
    2. [Application Integration] ASP.NET application running in Windows Azure Web Sites using the Windows Azure Service Bus Relay Service to invoke a web service on my laptop.
    3. [Application Integration] App running in Windows Azure Web Sites sending message to Windows Azure BizTalk Services. Message then dropped to one of three queues polled by a Node.js application running in CloudFoundry.com.
    4. [Application Integration] App running in Windows Azure Web Sites sending message to Windows Azure Service Bus Topic, and polled by both a Node.js application in CloudFoundry.com, and a BizTalk Server 2013 server on-premises.
    5. [Application/Data Integration] ASP.NET application that uses a local SQL Server database but changes the connection string (only) to instead point to a shared database running in Windows Azure.
    6. [Data Integration] Windows Azure SQL Database replicated to on-premises SQL Server database through the use of Windows Azure SQL Data Sync.
    7. [Data Integration] Account list from Salesforce.com copied into on-premises SQL Server database by running ETL job through the Informatica Cloud.
    8. [Identity Integration] Using a single set of credentials to invoke an on-premises web service from a custom VisualForce page in Salesforce.com. Web service exposed via Windows Azure Service Bus Relay.
    9. [Identity Integration] ASP.NET application running in Windows Azure Web Sites that authenticates users stored in Windows Azure Active Directory.
    10. [Identity Integration] Node.js application running in CloudFoundry.com that authenticates users stored in an on-premises Active Directory that’s running Active Directory Federation Services (AD FS).
    11. [Identity Integration] ASP.NET application that authenticates users via trusted web identity providers (Google, Microsoft, Yahoo) through Windows Azure Access Control Service.
    12. [Network Integration] Using new Windows Azure point-to-site VPN to access Windows Azure Virtual Machines that aren’t exposed to the public internet.

    Against all odds, each of these demos worked fine during the presentation. And I somehow finished with 2 minutes to spare. I’m grateful to see that my speaker scores were in the top 10% of the 350+ breakouts, and hope you’ll take some time to watch it. Feedback welcome!

  • Calling Salesforce.com REST and SOAP Endpoints from .NET Code

    A couple months back, the folks at Salesforce.com reached out to me and asked if I’d be interested in helping them beef up their .NET-oriented content. Given that I barely say “no” to anything – and this sounded fun – I took them up on the offer. I ended up contributing three articles that covered: consuming Force.com web services, using Force.com with the Windows Azure Service Bus, and using Force.com with BizTalk Server 2013.  The first article is now on the DeveloperForce wiki and is entitled Consuming Force.com SOAP and REST Web Services from .NET Applications.

    This article covers how to securely use the Enterprise API (strongly-typed, SOAP), Partner API (weakly-typed, SOAP), and REST API. It covers how to authenticate users of each API, and how to issue “query” and “create” commands against each. While I embedded a fair amount of code in the article, it’s always nice to see everything together in context. So, I’ve added my Visual Studio solution to GitHub so that anyone can browse and download the entire solution and quickly try out each scenario.

    Feedback welcome!

  • My New Pluralsight Course – Patterns of Cloud Integration – Is Now Live

    I’ve been hard at work on a new Pluralsight video course and it’s now live and available for viewing. This course, Patterns of Cloud Integration,  takes you through how application and data integration differ when adding cloud endpoints. The course highlights the 4 integration styles/patterns introduced in the excellent Enterprise Integration Patterns book and discusses the considerations, benefits, and challenges of using them with cloud systems. There are five core modules in the course:

    • Integration in the Cloud. An overview of the new challenges of integrating with cloud systems as well as a summary of each of the four integration patterns that are covered in the rest of the course.
    • Remote Procedure Call. Sometimes you need information or business logic stored in an independent system and RPC is still a valid way to get it. Doing this with a cloud system on one (or both!) ends can be a challenge and we cover the technologies and gotchas here.
    • Asynchronous Messaging. Messaging is a fantastic way to do loosely coupled system architecture, but there are still a number of things to consider when doing this with the cloud.
    • Shared Database. If every system has to be consistent at the same time, then using a shared database is the way to go. This can be a challenge at cloud scale, and we review some options.
    • File Transfer. Good old-fashioned file transfers still make sense in many cases. Here I show a new crop of tools that make ETL easy to use!

    Because “the cloud” consists of so many unique and interesting technologies, I was determined to not just focus on the products and services from any one vendor. So, I decided to show off a ton of different technologies from across the industry.

    Whew! This represents years of work, as I’ve written about or spoken on this topic for a while. It was fun to collect all sorts of tidbits, talk to colleagues, and experiment with technologies in order to create a formal course on the topic. There’s a ton more to talk about besides just what’s in this 4-hour course, but I hope that it sparks discussion and helps us continue to get better at linking systems, regardless of their physical location.