Finally, a Flow Component in Community Builder!

I took a peek inside my Winter ’18 pre-release Org and found one awesome Flow improvement that I’ve been waiting on for quite a while.  We now have a Flow Component inside our Community Builder!  There is nothing fancy about this component compared to the one we’ve got in our internal Lightning App Builder, but the fact that we can now display a Flow to our Customers and Partners without it being in Visualforce is fantastic.

Under the Process Automation Section in Components, you can now see the Flow Component visible for you to drag onto your page.

Flow Component.jpg

Use cases would include providing a guided experience for Case Entry, or a place for Customers to fill out a Survey.  You can make a Flow its own Page in your Community and set the page to be visible without a User logging in by changing the Page Access.

Flow Component 2.jpg

Make Public Flow

Now, we do have some limitations (at this time) around what values we can “pass in”, so you’re not going to be able to pass in any parameters through the URL.  This does limit some of the creativity that we have, but for some of the basic use cases this will do just fine.  You’ll see below that it really is an exact copy of what we have in the Lightning App Builder.

Flow Inputs.jpg

As you can tell, I’m quite excited for this update.  I know that this will allow the use of Flows in Communities on a much larger scale, as now we don’t have to worry about the ugly Visualforce Page UI for the Flow being visible to customers or partners.

How to set the Flow Finish Behavior to a newly created Record WITHOUT an additional Screen

One of the biggest issues that many Flow Users have when they’re looking to build a Visual Flow is that they can’t redirect the User to the newly created Record without an intermediary Flow Screen confirmation page.  It adds one pointless click into the mix that we as Admins always try to reduce throughout our Orgs.  What if you didn’t have to use the approach from my How to pass a new variable out of Flow post to do this?  There might just be another way!  Let’s break down the concept of the End Users’ process at a high level…

FlowVFRedirect.jpg

While we are still going to have to use Apex and Visualforce to make this happen, we’re now able to do it without the extra Screen.  This is because, by the time we Redirect our End User to the Visualforce Page, the newly created Record is in the system.  We can then use a really simple SOQL query to find the newly created Record and immediately send them to that record.

Let’s break this down further and talk about the first piece of this process… our Visual Flow and how it can be launched.

The Visual Flow can be launched with either a Visualforce Page or through the URL.  I would typically aim for the URL, if you’re in Classic, as you can take advantage of the new Flow Skin.  For this blog post, that’s the method we’ll be using, and we’ll be doing it through a Button.

In this scenario, to keep it simple, we’re going to have the Flow create a Contact and then redirect us to that Contact record.  So, we need to create a Text Field called Unique Flow Identifier on the Contact for us to pass in as a variable to the newly created Record.

Unique_Flow_Identifier.jpg

We’ll cover what goes into this Field shortly, but let’s take a look at our Flow first.

Create Contact Flow.jpg

We’ll need to create our AccountId and UniqueId Variables, and ensure they are listed as Input variables.

AccountId

Unique_Id

Notice how we are passing in our Unique Identifier into the Contact on creation.  This is how we will query to find it.

Record Create

Our Flow is all set, so we can Save and Activate it.

Now, we’ll get to the Visualforce Page and Apex in a moment, but first let’s assume that we’ve already created our Visualforce Page called FlowRedirect so we can talk about our Button’s URL.

/flow/Create_Contact?AccountId={!Account.Id}&UniqueId={!$System.OriginDateTime}{!Account.Id}
&retURL=/apex/FlowRedirect?Id={!$System.OriginDateTime}{!Account.Id}

In this URL, I am combining the Account ID and the System Origin Date Time to form my Unique Identifier.  You could take this a step further and also use the User ID or another parameter.  If you plan to re-use the same Visualforce Page for multiple Objects, you’ll want to pass in another parameter to let you know which Object to query.

Alright… let’s create our Apex Controller.

FlowRedirectController.jpg

public with sharing class FlowRedirectController {
    public PageReference FlowRedirectController() {
        // Grab the Unique Identifier passed in through our Button's URL
        String uniqueId = ApexPages.currentPage().getParameters().get('id');
        // Return Home if no Id was passed in
        if (uniqueId == null) { return new PageReference('/home/home.jsp'); }
        // Find the newest Contact stamped with our Unique Identifier
        List<Contact> contacts = [SELECT Id FROM Contact
                                  WHERE Unique_Flow_Identifier__c = :uniqueId
                                  ORDER BY CreatedDate DESC LIMIT 1];
        // Return Home if we didn't find a match
        if (contacts.isEmpty()) { return new PageReference('/home/home.jsp'); }
        // Redirect to the newly created Contact
        return new PageReference('/' + contacts[0].Id);
    }
}

Our Apex Controller is grabbing the id that we send in through our Button, and then doing a quick Query on the Contacts in our system to find the match.

Now for our Visualforce Page, which is going to be bare bones, because the End User hopefully shouldn’t ever see it.

Visualforce Page.jpg

<apex:page controller="FlowRedirectController" action="{!FlowRedirectController}">
</apex:page>

Our Visualforce Page calls our FlowRedirectController action as soon as the page opens, resulting in an immediate redirect to the newly created Record.

Let’s watch it in action!

2017-07-23_16-51-42-1

And just like that we have bypassed the “must have a screen” feature in Flow for a redirect.  I’ll be looking to get something on GitHub in the near future to clean this process up further, and provide some test coverage.

Get Latitude and Longitude on Custom Address Fields in Salesforce (US Addresses Only)

This is a fun little way to get Latitude and Longitude for US Addresses on any Object in Salesforce.  Not everyone has the skills or budget to do a Google API callout and get the Latitude and Longitude for any Object, but this is something that can be solved pretty easily for anyone not dealing with foreign addresses.

First, we need to turn on Workflow Rules for the Geocoding from Data.com.  Go to Setup and type in Data Integration Rules.

Data Integration Rules.jpg

Click on Geocodes for Account Billing Address.  You don’t have to do this on the Account; you can use Lead or Contact instead.  I would personally suggest using whichever of these Objects has the least amount of automation and/or usage volume around it.  If you don’t use Leads at your company, then substitute “Lead” everywhere this post says “Account”.

Account GeoCode

Select Edit Rule Settings.

Edit Rule Settings.jpg

Uncheck the Bypass workflow rules option.  Salesforce is doing an asynchronous callout to Data.com Geo to get the Latitude and Longitude, and that response from Data.com Geo is not going to happen in the transaction of your Save.  It happens quickly after, but it is a different transaction.  You’ll want to ensure you don’t have any recursive issues where a Workflow fires twice, so test thoroughly!

Uncheck bypass WF

Alright, so we’ve got Data.com Geo all set to go.  Now, we need to create a Lookup Field on the Account that points to our Custom Object (Project, in this example).  Make it clear what this field is for, and write a description.  Keep the FLS minimal, as nobody will need to see this field but the Admin.

Lat Long.jpg

We also need to create a Geolocation field, if you haven’t already, on the Project Object.  This will be where you store the Latitude and Longitude values.

Geolocation Field.jpg

Let’s navigate to setup a new Process Builder.

Project PB.jpg

Set Project as the Object this Process Builder runs on.

Set on Project

Now, let’s set our Criteria.  You want to ensure that this fires when you want it to.  I’m accounting for an Address change in my criteria.

Criteria.jpg

Time to set up our Immediate Action of Record Creation (of an Account).

Set Field Values.jpg

Activate it, and we’re ready to move to the next item: creating a Flow that will delete the newly created Account and update the Project.  After that, we’ll create a new Process Builder to launch the Flow.

The first element of our Flow will be a Record Delete, to Delete the Account.  Drag the Record Delete element out.

DragDelete.jpg

In the Record Delete, we’ll need an AccountId variable marked as Input Only, so that our Process Builder can pass the Account’s Id into the Flow.

AccountId var

Ensure the Record Delete is set up correctly and hit Save.

DeleteAccount

Now, let’s set our Record Delete as the Start Element of our Flow.

Delete as Start

We need to update our Project, so we will drag out the Record Update element.

Record Update drag

We’ve got a few more variables we need to create.  First, let’s create ProjectId and ensure it is marked as Input Only.

ProjectId

Second, we need to create Latitude and mark it as a Number with a Scale of 8 and Input Only.

Latitude

The last variable we need to create is Longitude and mark it as a Number with a Scale of 8 and Input Only.

Longitude

Let’s fill out the Record Update using these variables.

Record Update

Hit Save and connect the Elements together.

Connect the Dots

Save your Flow.

Save the Flow

Activate your Flow.

Activate Flow

Great, now we’re ready for the last step.  Creating our Process Builder to Launch the Flow.  Navigate back over to create a new Process Builder.

Account PB

Select Account as the Object.

Account PB 2

Now, let’s set our Criteria.  We’re going to ensure that this only fires when the Project field is filled out, so this doesn’t accidentally fire at random.

Account Criteria.jpg

Time to map our Fields inside our Flow Immediate Action.

Flowvars.jpg

Save the Immediate Action, and Activate your Process Builder.

Congrats!  You now have Latitude and Longitude being populated on your Custom Object for all US Addresses.

Introduction to Dynamic Record Choices in Flow

Dynamic Record Choices are one of the coolest features in Visual Flow that often gets overlooked.  In this post we’ll be going over a brief introduction of what and how Dynamic Record Choices might be used.  I’ll be following this post up with an in-depth dive of how you use these in different scenarios.  So, what is a Dynamic Record Choice?

A Dynamic Record Choice is a query on an Object in Salesforce that you may add filters to, and have those results presented to your End User for selection.  The records that you return will be dynamic based on your filters, and the End User can select one or multiple records to perform an action on.

I like to think of it as a Report or a List View.  You select your Object, add your Filters, and then let the User select the record(s) they want to work with.

DRC

Let’s say your Sales team wants Tasks to be entered with minimal fields and effort.  Sales Users are annoyed because the Contact lookup on Tasks is not filtered to just the Contacts on the Account record they’re on.  A Dynamic Record Choice can come in for the Contact selection, presenting a short-list of just those Contacts.  Here is a visual comparison between Standard functionality and Dynamic Record Choice:

DRC compare.png

As you can tell, the End User Experience is going to be easier with the Dynamic Record Choice.  With Standard you’ve got to search, and with Dynamic Record Choice you’re able to present a dropdown (or multi-select picklist/checkboxes).  You can grab variables for filtering from anywhere in Salesforce very easily, since you’ll be using Flow.  It also allows you to be flexible and select multiple records at once and perform the same action on all of them.

Let’s talk about some negatives here… #1 is we have very Limited Labels.  Unfortunately, the only option we have is to concatenate a group of fields together to give more “information” about the results.  If many “columns” of information are needed, then Dynamic Record Choice might not be the right solution for you.

The second negative is really a Flow limitation: the filters on a Dynamic Record Choice.  Some complicated filters simply aren’t possible in Flow.  On top of that, if you do want a filter driven by End User input, that input needs to be collected on a previous screen.  Sometimes those extra clicks can make the End User experience a negative one.

Recap: Dynamic Record Choices rock, and you should look at how they might streamline your End Users’ lives.  They’re not like duct tape, because they can’t fix every problem, but they’re pretty awesome.  Dealing with a Dynamic Record Choice can be difficult the first few times, because there are a few moving parts, so be patient and test thoroughly that the variables are all being passed through.

Tips for Successfully Deploying Wave Analytics

As much as I love Wave Analytics, it is not very fun to deploy from Sandbox (at this time).  There are many different things to keep in mind when moving Wave from one environment to another (even if it is just Sandbox to Sandbox).  Knowing these shortcomings ahead of your pending deployment will save you a big headache, as you can plan accordingly.  There is a good bit to discuss, so let’s jump right in!

Salesforce-Analytics-Cloud-cropped

Apps

These are nice and easy to deploy, but you have to be careful of the username naming convention you’re using if you have any specific sharing to Users on your App.  This can be a problem if you’re a consultant set up in a Sandbox separately from the Production environment, where the naming convention for your Username may have been switched up.  Or, it could be from assigning the App to a specific Community User that was only created in your Source Org and not your Target Org.  Either scenario will cause an error.

WaveError.jpg

Dataflow

This is something that you need to be careful with.  The Dataflow that you build in your Source Org will overwrite the existing Dataflow in your Target Org.  This means, if you have anything in your Dataflow (if it already exists) in the Target Org, you need to make sure you still have it in the Source Org’s Dataflow, or it will be deleted.

The key with Dataflows is to immediately run them to get all of your Datasets populated as quickly as possible.  Make sure you’ve got all the fields in your Target Org that the Dataflow references, and that the Analytics Cloud Integration User has FLS.  You can deploy a Dataflow without meeting those requirements, and it will error when you run it if you forget.

Datasets

I’m just going to come out and say that you shouldn’t deploy any of your Datasets.  Let your Dataflow create them upon running the first time.  Otherwise, you run the risk of having the Dataset Name adjusted if you do anything out of sequence.  And, this can cause additional issues when you’re deploying complex Dashboards (as I’ll touch on in more detail shortly).

Dashboards & Lenses

Dashboards and Lenses are annoyingly close to complete.  These will show you an error when you first open them, because the datasets they’re targeting are empty.

Dashboard Error.jpg

What you have to do is make sure your Dataflow has successfully populated the new Datasets, and then go into the JSON of your Dashboard or Lens and make the adjustments.  Note on the above error: it will make you press Continue for every Dataset in your Dashboard… so don’t think it’s broken if you have 10 Datasets, you just have to click 10 times.

The first adjustment, you’ve heard it a million times… don’t hard-code IDs in Salesforce!!  Well, in Wave, you’ve got no other choice.  When you deploy a new Dataset, your Target Org will assign it a new Dataset ID.  Your Dashboards and Lenses will still be referencing the old Dataset ID, so you need to go in and do a “Find and Replace” for the Dataset ID.  This can be pretty easy if you’ve got a simple Dashboard with one Dataset, but once you get into double digits, you run into some of the other potential areas of trouble…

The Dataset Name is also used inside the Connectors, aka dataSourceLinks (how you link Datasets together so they dynamically filter), and in any SAQL (the “pigql” entries in the JSON).  So, if you deployed a Dataset incorrectly and the name changed, you’ll have to update the Name in these spots as well, just like you did with the ID.
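Since this is a plain find-and-replace across the exported JSON, you can even script it.  Here’s a rough Python sketch; the Dataset IDs and names below are made up, and you’d paste in the JSON from your Dashboard’s edit view:

```python
import json

# Hypothetical mapping of Source Org values to Target Org values --
# grab the real ones from the Dataset detail pages in each Org.
replacements = {
    "0FbB0000000CabcKAU": "0FbB0000000CxyzKAU",  # old Dataset ID -> new Dataset ID
    "Opportunities_v1": "Opportunities",          # old Dataset name -> new name
}

def retarget(dashboard_json, replacements):
    """Find-and-replace Dataset IDs/names across the Dashboard JSON,
    which catches steps, dataSourceLinks, and pigql strings in one pass."""
    for old, new in replacements.items():
        dashboard_json = dashboard_json.replace(old, new)
    json.loads(dashboard_json)  # sanity check: the result is still valid JSON
    return dashboard_json
```

This is just a sketch of the find-and-replace step; you’d still paste the result back into the Dashboard JSON editor yourself.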

Recipes

Recipes became GA in Spring ’17.  They’re extremely powerful and are getting even stronger with the Summer ’17 release, when they become accessible through the REST API.  What they allow you to do is filter and transform an existing Dataset extremely easily.  You can do joins to other Datasets, bucketing of fields, adding of filters, and more.  The issue here is, they don’t live anywhere in the metadata (at this time).  So, whatever you create in a Recipe, you’re going to have to manually re-create in your Target Org.  At the rate they’re improving all aspects of Wave, I am hopeful this becomes deployable with the Winter ’18 release.

Security Predicates

Unfortunately, Security Predicates don’t live anywhere (at this time) from which you can deploy them.  Luckily, these are typically straightforward, meaning it’s usually a quick copy & paste to get your Security Predicate moved into the new environment.  When you’re adding a Security Predicate to your new environment, you need to make sure you meet these basic requirements:

  1. Analytics Security User has READ Access to all referenced Fields
  2. The Running User (you) has READ Access to all referenced Fields

In short, make sure you correctly deployed the FLS for the Fields that you had in your previous environment, or you’ll be running around in circles.
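As a reference point, a Security Predicate is just a one-line expression, which is why copy & paste works.  For example, a simple row-level predicate that only shows Users the rows they own looks something like this (assuming your Dataset has an OwnerId field):

```
'OwnerId' == "$User.Id"
```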

RECAP

Sometimes I wonder why I bother to deploy this at all.  It would be (at the time of this post) almost just as much effort to simply copy & paste the work over to the new environment.  Be careful of the known shortcomings and plan accordingly.  Because of these issues, depending on the size of your deployment, you need to be aware of the additional time it will take to deploy.

3 Best Practices for Optimizing Wave Analytics Dataflows

For those of you that are already using Wave Analytics in a Production environment, you hopefully took a look at the Wave Data Monitor when scheduling your Dataflow.  If you’re working in a Full Sandbox or Production environment and running the Dataflow, you’re typically dealing with large data volumes.  In those scenarios, you really want to make sure your Dataflow is built correctly, because that is when you can start hitting some longer times to refresh your data.

Wave.png

Optimize!  Only Import Records Once

As of Summer ’17, this will be much easier to do!  Importing your list of Accounts three times into Wave is silly.  Reuse the “Extract Account” node for anything that is referencing the Account.  Don’t extract all of your Accounts more than once.  You can use a recipe to do the filtering (if any) that needs to be done on your Dataset.  As mentioned, with the new Dataflow builder, this is going to be much easier and not require any JSON code.  However, you’ve got to be aware of the potential issue and make sure you put the effort into reusing your different sfdcDigests that are extracting records.

bi_integrate_dataflow_editor_nodes_on_canvas.png
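In dataflow JSON terms, reusing a digest looks something like the sketch below (node names and fields are made up for illustration).  Both augment nodes point their right side at the same Extract_Account node instead of each having their own Account extract:

```json
{
  "Extract_Account": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "Industry" } ]
    }
  },
  "Augment_Opportunity": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Opportunity",
      "left_key": [ "AccountId" ],
      "right": "Extract_Account",
      "right_key": [ "Id" ],
      "right_select": [ "Name", "Industry" ],
      "relationship": "Account"
    }
  },
  "Augment_Case": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Case",
      "left_key": [ "AccountId" ],
      "right": "Extract_Account",
      "right_key": [ "Id" ],
      "right_select": [ "Name", "Industry" ],
      "relationship": "Account"
    }
  }
}
```

(Extract_Opportunity and Extract_Case would be their own sfdcDigest nodes; the point is that Account is extracted exactly once.)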

Use Incremental Loading Where Possible [Enable Replication (Winter ’17)]

Why would you want to bring in records that haven’t changed?  Incremental loading speeds up your Dataflow by only pulling in the records that were modified since your Dataflow last ran.  As of Winter ’17, if you have Replication enabled, you get this behavior by default.  This is a huge boost for speed; if you have millions or hundreds of millions of records, you’ll greatly benefit from this feature.

wa_integrate_datamanager_replication.png

Don’t Bring In Every Field

If you add every Account field into your Dataset, that is more data Wave has to grab.  While Wave is extremely fast, if you don’t need it… don’t bring it over!  You’re just slowing down how long your Dataflow takes to run, and adding extra fields that you don’t need in your Lens.  From personal experience, Long Text Area fields are the worst offenders.

TLDR: Don’t bring in extra records or extra fields, and turn on Replication (Incremental Loading).

Comparing using Flow Loops to a Data Update using Excel

There are many similarities between a Flow Loop and a manual Data Update using Excel.  We’re going to walk through a process and compare, throughout, how the Excel steps relate to the Flow.  When I say Flow Loop, I’m referring to the process of using a Fast Lookup, looping through the records, and then updating the records in a Fast Update.  (If you’ve got a better way to describe it, let me know!)

The below Flow is reassigning all of our Accounts to my new Sales team.  All Low Priority Accounts will go to Astro.  All Medium Priority Accounts will go to Einstein.  And, any Accounts without a Priority will be marked as High and all High Priority Accounts get assigned to Codey.

Easy enough, so let’s take the below finished Flow and walk through how you’d do everything this Flow does in Excel and Workbench (or another Data Loader).

FlowFinished

Part 1 – Creating the Report & Exporting to Excel  [Fast Lookup]

Alright, so let’s get a Report that includes the Customer Priority, Account Owner, and Account ID.

AccountReport

What this looks like in Flow is:

FastLookup.png

Note – AccountsToReassign is a Collection Variable

Alright, so now let’s export this data out so we can work with it in Excel!
ExportReport1

ExportReport

Part 2 – Modifying the Data  [Loop]

Now that we’ve exported out our Data, it’s time to work with it and make our updates!  In Flow, our version of skipping to the next row is by using our Loop element.  In this scenario, it looks like this:

Loop1

We’ve got a Medium Priority Account, that means we need to assign it to Einstein.

UpdateRow1

In Flow, this is what we just did:

Estine.png

We update the OwnerId of the Account to Einstein’s ID.

assign2e

Then, we add this Account to the Collection/List of Records we are going to update at the end of the Flow.

Add to Collection

Let’s go to our second record.  This one should also get assigned to Einstein.  Notice, we’ve only touched the first two rows; everything else is untouched.  Also, notice that we’ve actually done NOTHING to Salesforce yet… it’s all prep work at this point.  This is important, so I’m going to say it again: Salesforce doesn’t yet know of the two Account Ownership changes, because we’ve not updated the records… we’re just getting them ready to update.

UpdateRow2

Once again, in Flow, this is what we just did:

Estine.png

As we keep on Looping through these records, we eventually get to a Low Priority Account, and assign it to Astro.

UpdateRow4

In Flow, this is what we just did:

Astro.png

We update the OwnerId of the Account to Astro’s ID.

assign2a

Then, we add this Account to the Collection/List of Records we are going to update at the end of the Flow.

Add to Collection

As we keep updating each record, “looping” through them, we eventually get to the last record.  This record doesn’t have any Priority, so we need to update the Priority AND assign the Owner to Codey.

UpdateRow11

In Flow, this is what we just did:

Codey.png

Notice, our assignment is slightly different than the last two, because we’re updating the Customer Priority Field as well as the OwnerId here.

Assign Updates

For the last time, we add this Account to the Collection/List of Records we are going to update at the end of the Flow.

Add to Collection

Part 3 – Updating the Data  [Fast Update]

Now, we’ve correctly updated our Excel spreadsheet.  We’re ready to commit these changes into Salesforce.  So, let’s navigate to our Data Loader tool of choice, and get ready to Update the data!

Update1

Map the Fields in the Excel file to the Salesforce Fields

Update2

Next, we take that deep breath in, and hit Confirm Update.

Update3.png

In Flow, all that looks like is this:

Update Accounts

Note – AccountsToUpdate is a Collection Variable

Now that we’ve finished our update, we want to navigate back to our report and admire our craftsmanship:

UpdatedAccounts

Recap: If you’re an Admin that has done any sort of data manipulation in Excel and then updated Salesforce, you are equipped to master Fast Lookups, Loops, and Fast Updates/Creates.
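If it helps to see the whole pattern in one place, here’s a rough sketch of the same logic in Python (the Owner IDs are made up): the list is our Fast Lookup results, the for-loop is the Loop element, and the single bulk commit at the end is the Fast Update.

```python
# Made-up Owner IDs standing in for Astro, Einstein, and Codey
ASTRO, EINSTEIN, CODEY = "005-ASTRO", "005-EINSTEIN", "005-CODEY"

def reassign(accounts):
    """accounts: list of dicts, the 'Fast Lookup' (our exported report)."""
    accounts_to_update = []              # the Collection Variable
    for account in accounts:             # the Loop element
        priority = account.get("CustomerPriority")
        if priority == "Low":
            account["OwnerId"] = ASTRO
        elif priority == "Medium":
            account["OwnerId"] = EINSTEIN
        else:
            # A blank Priority becomes High; all High Accounts go to Codey
            account["CustomerPriority"] = "High"
            account["OwnerId"] = CODEY
        accounts_to_update.append(account)   # add to the Collection
    # Nothing has touched Salesforce yet -- this list is what the
    # 'Fast Update' commits in one bulk operation at the very end.
    return accounts_to_update
```

Just like in the Flow, no record changes until the whole batch is handed over at the end.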

Related Resources:

  1. How to use a Fast Lookup
  2. How to use a Loop
  3. Counting in Loops
  4. How to use a Fast Update