3 Best Practices for Optimizing Wave Analytics Dataflows

For those of you already using Wave Analytics in a Production environment, you’ve hopefully taken a look at the Wave Data Monitor when scheduling your Dataflow. If you’re running the Dataflow in a Full Sandbox or Production environment, you’re typically dealing with large data volumes. In those scenarios, you really want to make sure your Dataflow is built correctly, because that’s when your data refresh times can start to drag.

Wave.png

Optimize!  Only Import Records Once

As of Summer ’17, this will be much easier to do! Importing your list of Accounts three times into Wave is silly. Reuse the “Extract Account” node for anything that references Account data, and don’t extract all of your Accounts more than once. You can use a recipe to do any filtering that needs to happen on your Dataset. As mentioned, the new Dataflow builder makes this much easier and doesn’t require any JSON code. However, you’ve got to be aware of the potential issue and put the effort into reusing the sfdcDigest nodes that extract your records.

bi_integrate_dataflow_editor_nodes_on_canvas.png
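Here’s a stripped-down excerpt of what that looks like in the dataflow JSON. The node names are hypothetical, and Extract_Opportunity and Extract_Case are assumed to be defined elsewhere in the same dataflow; the point is that both augment nodes point their right parameter at the one Extract_Account digest instead of each extracting Accounts again.

    "Extract_Account": {
      "action": "sfdcDigest",
      "parameters": {
        "object": "Account",
        "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "Industry" } ]
      }
    },
    "Augment_Opportunity_Account": {
      "action": "augment",
      "parameters": {
        "left": "Extract_Opportunity",
        "left_key": [ "AccountId" ],
        "right": "Extract_Account",
        "right_key": [ "Id" ],
        "right_select": [ "Name", "Industry" ],
        "relationship": "Account"
      }
    },
    "Augment_Case_Account": {
      "action": "augment",
      "parameters": {
        "left": "Extract_Case",
        "left_key": [ "AccountId" ],
        "right": "Extract_Account",
        "right_key": [ "Id" ],
        "right_select": [ "Name", "Industry" ],
        "relationship": "Account"
      }
    }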

Use Incremental Loading Where Possible [Enable Replication (Winter ’17)]

Why bring in records that haven’t changed? Incremental loading speeds up your Dataflow by extracting only the records that have been modified since the Dataflow last ran. As of Winter ’17, if you have Replication enabled, this is turned on by default. It’s a huge boost for speed: if you have millions or hundreds of millions of records, you’ll greatly benefit from this feature.

wa_integrate_datamanager_replication.png
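If you ever need to control this per object in the dataflow JSON rather than through the Data Manager setting, my recollection is that the sfdcDigest node accepts an incremental flag once replication is enabled; treat the attribute below as an assumption and confirm it against the dataflow documentation for your release.

    "Extract_Account": {
      "action": "sfdcDigest",
      "parameters": {
        "object": "Account",
        "incremental": true,
        "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "LastModifiedDate" } ]
      }
    }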

Don’t Bring In Every Field

If you add every Account field into your dataset, that’s more data that Wave has to grab. While Wave is extremely fast, if you don’t need it… don’t bring it over! You’re just slowing down how long your Dataflow takes to run and adding extra fields that you don’t need in your Lens. From personal experience, Long Text Area fields are the worst offenders.
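In dataflow JSON terms, that means keeping the fields list on each sfdcDigest as short as your Lenses actually require. A minimal sketch, with the field choices purely as an example:

    "Extract_Account": {
      "action": "sfdcDigest",
      "parameters": {
        "object": "Account",
        "fields": [
          { "name": "Id" },
          { "name": "Name" },
          { "name": "OwnerId" },
          { "name": "AnnualRevenue" }
        ]
      }
    }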

TLDR: Don’t bring in any extra records or extra fields, and turn on Replication (Incremental Loading).


Comparing Flow Loops to a Data Update Using Excel

There are many similarities between a Flow Loop and a manual Data Update using Excel. We’re going to walk through a process and compare throughout how the Excel steps relate to the Flow. When I say Flow Loop, I’m referring to the process of using a Fast Lookup, looping through the records, and then updating the records with a Fast Update. (If you’ve got a better way to describe it, let me know!)

The Flow below reassigns all of our Accounts to my new Sales team. All Low Priority Accounts will go to Astro. All Medium Priority Accounts will go to Einstein. And any Accounts without a Priority will be marked as High, with all High Priority Accounts assigned to Codey.

Easy enough, so let’s take the below finished Flow and walk through how you’d do everything this Flow does in Excel and Workbench (or another Data Loader).

FlowFinished

Part 1 – Creating the Report & Exporting to Excel  [Fast Lookup]

Alright, so let’s get a Report that includes the Customer Priority, Account Owner, and Account ID.

AccountReport

What this looks like in Flow is:

FastLookup.png

Note – AccountsToReassign is a Collection Variable

Alright, so now let’s export this data out so we can work with it in Excel!
ExportReport1

ExportReport

Part 2 – Modifying the Data  [Loop]

Now that we’ve exported our data, it’s time to work with it and make our updates! In Flow, our version of skipping to the next row is the Loop element. In this scenario, it looks like this:

Loop1

 

We’ve got a Medium Priority Account, which means we need to assign it to Einstein.

UpdateRow1

In Flow, this is what we just did:

Estine.png

We update the OwnerId of the Account to Einstein’s ID.

assign2e

Then, we add this Account to the Collection/List of Records we are going to update at the end of the Flow.

Add to Collection

Let’s go to our second record. This one should also get assigned to Einstein. Notice, we’ve only touched the first two rows; everything else is untouched. Also, notice that we’ve actually done NOTHING to Salesforce yet… it’s all prep work at this point. This is important, so I’m going to say it again: Salesforce doesn’t yet know about the two Account Ownership changes, because we haven’t updated the records… we’re just getting them ready to update.

UpdateRow2

Once again, in Flow, this is what we just did:

Estine.png

As we keep on Looping through these records, we eventually get to a Low Priority Account, and assign it to Astro.

UpdateRow4

In Flow, this is what we just did:

Astro.png

We update the OwnerId of the Account to Astro’s ID.

assign2a

Then, we add this Account to the Collection/List of Records we are going to update at the end of the Flow.

Add to Collection

As we keep updating each record, “looping” through them, we eventually get to the last record. This record doesn’t have a Priority, so we need to update the Priority AND the Owner, assigning it to Codey.

UpdateRow11

In Flow, this is what we just did:

Codey.png

Notice, our assignment is slightly different than the last two, because we’re updating the Customer Priority Field as well as the OwnerId here.

Assign Updates

For the last time, we add this Account to the Collection/List of Records we are going to update at the end of the Flow.

Add to Collection

Part 3 – Updating the Data  [Fast Update]

Now, we’ve correctly updated our Excel spreadsheet.  We’re ready to commit these changes into Salesforce.  So, let’s navigate to our Data Loader tool of choice, and get ready to Update the data!

Update1

Map the Fields in the Excel file to the Salesforce Fields

Update2

Next, we take that deep breath in, and hit Confirm Update.

Update3.png

In Flow, all of that looks like this:

Update Accounts

Note – AccountsToUpdate is a Collection Variable

Now that we’ve finished our update, we want to navigate back to our report and admire our craftsmanship:

UpdatedAccounts

Recap: If you’re an Admin who has done any sort of data manipulation in Excel and then updated Salesforce, you are equipped to master Fast Lookups, Loops, and Fast Updates/Creates.
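If you’re ever curious how the same pattern reads in code, here’s a hedged Apex sketch of the equivalent logic, runnable as anonymous Apex. It assumes a CustomerPriority__c picklist on Account and that Astro, Einstein, and Codey exist as Users in your Org; adjust the names and field to match your setup.

    // Fast Lookup: grab the Accounts to reassign (all of them, in this example)
    List<Account> accountsToReassign = [SELECT Id, OwnerId, CustomerPriority__c FROM Account];

    // Hypothetical: find our three owners by first name; adjust to however you identify them
    Map<String, Id> ownerByName = new Map<String, Id>();
    for (User u : [SELECT Id, FirstName FROM User WHERE FirstName IN ('Astro', 'Einstein', 'Codey')]) {
        ownerByName.put(u.FirstName, u.Id);
    }

    // Loop: make every change in memory first; nothing is committed to Salesforce yet
    List<Account> accountsToUpdate = new List<Account>();
    for (Account acct : accountsToReassign) {
        if (acct.CustomerPriority__c == 'Low') {
            acct.OwnerId = ownerByName.get('Astro');
        } else if (acct.CustomerPriority__c == 'Medium') {
            acct.OwnerId = ownerByName.get('Einstein');
        } else {
            // Blank Priority gets bumped to High, and High goes to Codey
            acct.CustomerPriority__c = 'High';
            acct.OwnerId = ownerByName.get('Codey');
        }
        accountsToUpdate.add(acct);
    }

    // Fast Update: one DML call commits the whole collection at once
    update accountsToUpdate;

The shape is identical: one query (Fast Lookup), one loop that only changes records in memory (Loop and Assignments), and one statement at the end that commits everything together (Fast Update).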

Related Resources:

  1. How to use a Fast Lookup
  2. How to use a Loop
  3. Counting in Loops
  4. How to use a Fast Update

Counting Inside a Loop

I often see posts from people who aren’t quite sure how to use a Loop in Salesforce. Loops are often used to count a specific number of records. This could be for a few different reasons:

  • Watching your Limits
    • Remember, you can’t have more than 2,000 elements accessed in one transaction.  You could use a counter to ensure that you stop before you’d hit that limit.
  • Creating n Records
    • If you want to create a specific number of records, like Tasks. I’ve seen this request come up on the Success Community and other places many times.
  • Custom Roll-up Summary in Flow
    • If you aren’t able to use something like Andy’s DLRS or a standard Roll-up Summary field, you can summarize an Amount or do a record count in Flow.

In this post, I’m going to go over the Watching your Limits scenario. If you want more information on a Custom Roll-up Summary in Flow, it’s actually the topic of one of my first posts (here)! I would also HIGHLY recommend watching Pete Fife’s Automation Hour presentation on Loops: http://automationhour.com/2017/02/pete-fife-deep-dive-into-loopsflow-21717/… he does a fantastic job covering the topic!

Let’s jump on into the details!

Most importantly, we’re going to need to create a variable to track our iterations inside our loop.  So, let’s create that variable.

LoopCounter

 

Now we’ve got a variable we can use to track the number of times we go through our Loop. I’ve seen this inefficiently added as its own Assignment inside a Loop. I’ll urge you to add it to one of your existing Assignments inside the Loop, because it won’t do any harm there and it saves you an element on every pass through the Loop.

Every time we loop through a record and do our Ownership reassignment, we add 1 to the value of our LoopCounter. If we have 20 records, by the end of the transaction this LoopCounter variable will equal 20. If we had 10, it would equal 10.

LoopCounterAssignment

 

In Flow, you have a limited number of elements that you can run through in each transaction. This means you need to be extra careful when dealing with potentially higher data volumes. Ideally, you really shouldn’t be getting close to that 2,000-element limitation… however, it doesn’t hurt to be paranoid and put in a Decision to double-check. So, that’s what we’re going to do. The value you use will vary based on how complicated your Loop is. You need to do some math to see the maximum number of records your Flow can handle, and I like to always go below that to be extra safe.
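For example, if each pass through the Loop runs four elements (the Loop itself, a Decision, and two Assignments), then roughly 2,000 ÷ 4 = 500 records is the theoretical ceiling for a single run, so I might stop at something like 450 to leave headroom. Those numbers are only an illustration; count the elements in your own Loop and do the math for your Flow.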

Decision.png

Fun Fact: you don’t actually have to route someone back to the Loop element for them to exit it. You can exit midway through the records, once you hit the specific number you want to loop through.

Decision in Loop.png

And just like that, we were able to count and tell if we were about to hit a limit. While this was all concept, did you catch an area I should have included? An alert to the Admin. Throw in a Chatter post or an email to yourself for when you hit that limit, so you can verify everything is still functioning as you’d expect.

 

Mastering Wave Data Security for Communities with Security Predicate

**UPDATE** As of Summer ’17, you can now use Salesforce Sharing Inheritance to accomplish this. If you’re using External Data Sources or need different security, the Security Predicate approach below still works great. https://help.salesforce.com/articleView?id=bi_security_datasets_sharing_getting_started.htm&language=en_US&type=0

I’ve worked on many projects where one of the main goals of the Community implementation is to display Wave Analytics to Community Users. If you’re familiar with security inside of Wave, you might have seen some of the common examples available in the documentation. Unfortunately, these don’t work well when you want to implement them for a Community. In addition, the way you implement a Security Predicate has changed since the feature was first released, and the documentation sometimes varies on how you would go about doing this. This post is going to walk you through how to set up a Security Predicate and master your Community’s Wave deployment!

wave.jpg

Business Case

We’re implementing a Customer Community that wants to see Case metrics. We want the Dataset to be dynamic based on the Running User and allow them to see only their own Account’s data. This would allow us to use the same Dataset and Dashboard for all of our customers.

That means being able to control access to the data at the record (row) level, just as you do today with Sharing Rules in your Org as an Admin. This would look like:

Admin View

Admin View

Community User View

RowLevelSecurity

If you’re new to Wave Analytics, you might be confused about why Wave doesn’t do this natively. The reason is that Wave Analytics connects to Salesforce via an Integration User, which is typically going to be an Admin-level Read-Only User. When we access Wave, we’re accessing the dataset with the credentials of that Integration User. That means that, to enforce Row-Level Security, we have to put a Security Predicate inside every Dataset we need to secure.

Alright, let’s get into how this is done.

To start off, we need to know what our Root is. In this case, pun intended, we’re going to be using the Case Object as our Root. The Root is the record you want returned. Since we’ve already got the Account ID field native to the Case Object, all we need to do is add a custom Text Formula Field called View All Data and make its value “View All Data”.

ViewAllData.png

On the User Object, we need a Formula Field that returns the same value when the User is an Internal User. To keep it simple, I’m just granting all Internal Users View All Data privileges. If you wanted to make this more granular, you could easily expand on this.

ViewAllDataOnUser.png
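As one hedged example of that User formula (an assumption on my part, so adapt it to however your Org distinguishes internal Users), Community Users have a ContactId on their User record and internal Users don’t:

    IF( ISBLANK( ContactId ), "View All Data", "" )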

Looking into our Dataset’s JSON, you can see we’ve got the AccountId and ViewAllData__c pulled in.  If you don’t bring in the fields you want to reference in your Security Predicate, it won’t work.

Fields in Dataset
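In the Case digest node of the dataflow JSON, that means both fields appear in the fields list, along these lines (the node name and extra fields are just illustrative):

    "Extract_Case": {
      "action": "sfdcDigest",
      "parameters": {
        "object": "Case",
        "fields": [
          { "name": "Id" },
          { "name": "Subject" },
          { "name": "Status" },
          { "name": "AccountId" },
          { "name": "ViewAllData__c" }
        ]
      }
    }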

Based on everything above, what we’re looking to have as our Security Predicate is:

'AccountId' == "$User.AccountId" || 'ViewAllData__c' == "$User.ViewAllData__c"

'AccountId' == "$User.AccountId" is how we add the Community User’s dynamic filter. All Cases where the AccountId matches the Community User’s AccountId will be shown. Keep in mind, you could get creative and use formula fields here if you wanted it to work for an Account Hierarchy instead of just one Account.

|| shows that we can use operators like OR in a predicate, giving you the flexibility to get creative with your sharing.

'ViewAllData__c' == "$User.ViewAllData__c" is how we grant access to all Internal Users: all of the Internal Users have “View All Data” populated on their User record, and all Cases have “View All Data” filled in as well.

Previously, we’d have to download the dataflow JSON and modify the sfdcRegister node to include our row-level security predicate there. Now, we just need to navigate to our Dataset in Wave. Note – if you attempt to add your security predicate the old way, it won’t work and you’ll be left scratching your head.

Once you’re at your Dataset, select Edit.

Cases Dataset.png

Insert the Security Predicate

Security Predicate.png

Select Update Dataset.  The next time your Dataflow runs it will update with your Security Predicate.  Just like that… you’re all set!  You’ve mastered row-level security in Wave.

Keep your Automation Simple

I’ll start this off by saying I’m guilty of not always keeping it simple. But I strive for simplicity. Just because you can build something in Salesforce 10 different ways doesn’t mean all 10 of those ways are right. The solutions we implement often have a great deal to do with the skills and budget our Org has available. Back when I was a Solo Admin, I was very guilty of duct-taping solutions together, because the alternative was to have nothing. We didn’t have the budget to hire developers for all of our ideas.

The purpose of this post is to discuss how we can simplify our solutions to make them easier for us to comprehend and maintain.  We’re going to walk through this set of requirements our project champion gave us:

  1. On Closed Won Opportunities, Alert Accounting with an Email
  2. Automatically Create a new Project for our Account Manager to run.
  3. Update the Account Owner to be the Account Manager
  4. Alert the Account Manager of their new Project, with a link to take them straight to the project.

Let’s take a first pass at solving this…

kiss-option-1

This accomplishes everything that we were looking to do.  We can now send the project champion a note saying that it has been completed… right?  Hold on!

Looking at this solution from end-to-end, how easy is this going to be for me to maintain?  On the surface, it’s pretty basic, but would an outsider easily comprehend it?

Let’s take another pass at simplifying it…

kiss-option-1b

We were able to simplify our automation by putting the Email Alerts into the Process Builder and Flow.  This looks easier to maintain and understand than our first process.  I would be tempted to call it quits here, but I think we could simplify the process further.

So, let’s take one last pass at simplifying this…

kiss-option-2

I’m feeling pretty good about this now. Everything is in one location, and I can see all of my automation around this scenario in one spot! Personally, I would go with our third and final solution if I were going to implement this automation. Don’t go crazy: when possible, avoid having Workflow Rules, Process Builder, Flow, and Apex all working together for one piece of automation.

Maximizing Productivity with the Utility Bar

For those of you still waiting to get struck by lightning, here’s another feature that you can marvel at until you make the move over (for those of you not using a console). For me, this is something I will look to start implementing in all of the Orgs I’m working in that use Lightning. I was excited about this feature and already deployed it to my internal Org. Previously, the Utility Bar was only available in a console app, but now it’s available to everyone, regardless of whether you’re in a console app or not.

Here’s a quick look at our Utility Bar in action:

utilitybar

Now, let’s talk about how this works, and why this can maximize productivity for you and End Users.  If you are in Lightning, you can go edit an existing App.  If you’re just getting started with Lightning, you’ll need to navigate to the App Manager and create a new Lightning App.  In the Lightning App wizard you’ll find the Utility Bar as the middle option.

utility-bar

What can we do?  Pretty much whatever we want!  There are 10 standard items already in there for us, but you can also develop your own custom Lightning Components that can be used in the Utility Bar.

add-items-to-utility-bar

Let’s say we want to add a Filter List (List View) onto our Bar… say our Opportunities Closing This Month.  Sure, that might be in the Recently Viewed List Views on the Opportunity Tab, but it isn’t guaranteed to be there.

Under the Utility Item Properties we’re able to customize the Icon, Width, & Height. You’ll see that we can also have it load when you open the App. This is a feature you’ll typically want to use if you’re doing something like CTI.

Icon

Back when Lightning Experience first came out, you had to download the images from the Lightning Design System. Not anymore; you can now reference the Utility Icons on their site by just the Icon’s name (https://www.lightningdesignsystem.com/icons/#utility). Be thoughtful about the Icon you select and make sure it makes sense for your Users.

Width & Height

This is one of those settings you’ll want to adjust as you test and see what the actual size of the component looks like on the screen.

utility-bar-ui

Now that we’ve set our size, let’s determine the Object, the existing List View (filter), and how many records we want to return.

utility-bar-properities

Simple as that, we’ve added our first Utility Bar Item!!  Let’s do another and put in some of our Key Links that people want to see.  All we need to do is select the Rich Text option and then paste in our Links for the End Users to access.

signpost

After you Save, your Utility Bar is ready for action!

keylinksoncase

How to Optimize your Process Builder and Quick Action Speeds

Process Builder and Actions (Quick Actions) are two of my favorite declarative tools in Salesforce. I’ve had a few posts on Actions and how powerful they can be when you add in Process Builder and sometimes even Flow. I’ve also had a few posts specifically on how to write your Process Builders so that you can optimize them for speed (read here). Now, the idea in today’s post does go against the concept of having One Process Builder per Object, but I think it poses a good argument for an exception. That argument is SPEED!

I’ve seen scenarios where we want to use Actions to create records. What if you have an Action that creates an Opportunity, and as a result you want the Opportunity Team dynamically updated and maybe a few Tasks automatically assigned to the new Opportunity Owner? This is all stuff that a simple Process on the Opportunity Object will handle perfectly, right? Yes. However, that automation can bring your Action to a crawl.

The Influence

The inspiration for this post comes directly from some of my work in Apex. I discovered Future Methods when writing some really fun code a while back. In short, a future method performs its work asynchronously, after your current transaction (creating or updating a record) finishes. This allows you to bypass some limits that you might otherwise have hit.
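If you haven’t run into them before, a future method looks something like the hedged sketch below; the class, method, and Task details are made up for illustration.

    public class OpportunityFollowUpService {
        // The @future annotation makes this run asynchronously,
        // shortly after the calling transaction commits.
        @future
        public static void createFollowUpTasks(Set<Id> opportunityIds) {
            List<Task> tasks = new List<Task>();
            for (Opportunity opp : [SELECT Id, OwnerId FROM Opportunity WHERE Id IN :opportunityIds]) {
                tasks.add(new Task(
                    WhatId = opp.Id,
                    OwnerId = opp.OwnerId,
                    Subject = 'Kick off the new project'
                ));
            }
            insert tasks;
        }
    }

The caller simply passes in the record Ids and moves on; the work happens a moment later, outside the save the User is waiting on. That delayed-but-almost-immediate behavior is exactly what we’re about to recreate declaratively.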

The Solution

The solution is to move our Actions from the Immediate Actions to the Scheduled Actions and set the schedule to 0 Hours After [date field], which ensures they fire almost instantly after the record creation or update.

Schedule 0 Hours.png

After setting the time, add in your Actions.

schedule0hours

Advantage

The real advantage here is speed. By delaying certain actions so they don’t run inside the transaction, you’re able to speed up your record saves. The result is a faster experience for your End Users. I don’t think I need to add anything to that 🙂

flash-2.png

Disadvantages & Considerations

I’m aware that the list of disadvantages for the one advantage might come off like one of those drug commercials with 30 seconds full of side effects… but here we go 🙂

The biggest disadvantage to me is that the record(s) you might be creating or updating won’t be “ready” instantly. If the User using the Action is also the one who’s expecting some of those Actions to happen automatically, then maybe this isn’t the best solution for them.

As I mentioned at the start of this post, this does go against the One Process Builder per Object design pattern.  If you have a scheduled action, you’re not able to do the “evaluate next” inside your Process.  That post, like this one, is not a mandate that you must follow.  As the Admin/Dev running your Org, you should be able to make an educated decision on what is best for your scenario.

The last consideration around implementing this is understanding its impact on testing.  It is much harder to test something that happens in the future.  This affects your manual testing and any Apex Test Classes.  I would be surprised if you didn’t have any scheduled actions in your Process Builders and Workflow Rules… so it isn’t like this type of action isn’t being used today by many of us.

Putting it all together

The goal of this post is not to say you must use delayed Process Builder actions. It is simply to add one more tool to your tool belt for when you’re working with Process Builder and Flow. These concepts are not one-size-fits-all… adjust them to fit your Org.

Assigning Variables to Flows in Lightning Pages

As of Spring ’17, we have a much easier User Experience for anyone attempting their first Flow in a Lightning Page. You’re now able to pick which variable you want to pass the Record ID into. Sounds great, right? Let’s dive right in!

Select the variable to pass the Record ID into

Now, with the check of a box, you’re able to select which variable you want to pass the Record ID into. You can name it whatever you’d like! Also, you can pass a static value into any of the variables, so that you’re able to tell where your Flow might have been used (since you can have a different Lightning Page for different Record Types or Apps).

flowlightningpage

What if you already setup a Flow when this was launched in Winter ’17?  Don’t worry, Salesforce has you covered:

If you created a Text input variable called recordId for the flow component during Winter ’17, the checkbox is automatically selected when your org upgrades to Spring ’17. This default selection happens only when your org upgrades to Spring ’17. If you create a Text variable called recordId in Spring ’17 or later, the checkbox is not selected by default.

Things to consider

#1 – No Merge Fields (for now)

Many of you are probably thinking about passing in a merge field.  You’ll need to hold off on that for the time being.  My guess is it might come in the future, once this is Generally Available and out of BETA.  But, for now, if you try and get clever and use something like {!Record.Name} or {!Account.Name} you’re going to have a bad time!

lightningpageflowerror

#2 – Keep your Inputs Clean

I touched on this in my Flowception post (essentially, making your Flows reusable), but you’ll want to make sure that you only mark the variables you need as an Input type. If you have too many variables marked as Input or Input and Output, you’re going to have a long and cluttered list. Be thoughtful and only make a variable an Input if it needs to be.


Case, Lead, and Opportunity Age Tracking [Unmanaged Packages]

One of my more popular posts is How to track the Case Age of each Status. That post focuses on the Case scenario, but the concept applies to many of the Objects in Salesforce. So, what I’ve done is create three packages:

  1. Case Status Age
  2. Opportunity Stage Age
  3. Lead Status Age
    • https://login.salesforce.com/packaging/installPackage.apexp?p0=04ti0000000YCdU
    • Installation Notes: 
      1. Add the Lead Status Values to the Status Picklist on the Lead Status Age Object
      2. Update the IsClosed__c field to be TRUE for your Closed Lead Status Values
      3. Verify Permissions. End Users don’t need access to Create/Edit/Delete the Lead Status Age Object.
      4. Add the Related List to your Lead Page Layout
      5. Create Reports & Dashboards

These are all set for you to install.  For a better understanding of how this works, please do read up on the post mentioned above.  If you have any suggestions that would improve these packages, please leave a comment or message me on twitter to let me know.

In addition, I’ve decided to launch a new page to house all of my unmanaged packages that I’ll be making available.  You’ll see another tab at the top of my blog called Packages that will be your go-to for this.  I look forward to adding additional packages that hopefully will provide value to you!

 

4 Easy Ways to Improve Your Org’s Lightning Experience Adoption

#1 Optimize the Home Page

Add a Rich Text Component to create your Home Page Links to key Reports, Dashboards, List Views, External Resources, and other items.  It is all about creating a streamlined process that is enjoyable to the End User.

rich-text-drag

I like to use Bold Text to break up the links into different sections.

richtext

Clean up the page and get rid of the excess. Think about what the User really wants to see when they log in to Salesforce. If Top Deals doesn’t really work well for your business, then remove it from the Layout. Drag in a Report Chart or other components that make sense for your End Users.

CleanUpHomePage.gif

#2 Utilize Default Tabs

This small trick is one of my favorite parts of Lightning Experience. I’m now able to set the default starting tab for a page. This is fantastic, because when you go to an Order you’ll typically want to see Order Lines (if you use them), while for a Contact you might care to see the Details first. With the ability to default the tabs, we’re able to adjust this on every layout!

default-tab

#3 Customize the Highlights Panel

You have the ability to put a few key fields at the top of your page. By doing this, you give the End User a consistent spot to look at those key Status, Amount, and Owner fields. Unfortunately, we’re currently limited in the number of fields we can display, so be selective about what you display. I prefer to use Image Formulas up here to give a quick visual representation of the record, as opposed to plain text, when possible. These fields are managed with Compact Layouts.

highlightpanel

#4 Get Creative with the Layouts

You’re not married to a layout. You can create your own Layout from one of the available templates. The ability to control not just the fields but the whole page’s layout, without any code, is really awesome, so don’t hesitate to use this feature. You do have to create a new Lightning Page from within Setup; you can’t hit Edit Page on a record and create a new layout template that way.

newlayout