
In this article, we will cover the basics of AmpScript and insert records into Sales Cloud. In our previous article, we learned how to personalize using AmpScript. Inline AmpScript starts and ends with %%. For example, to get the dynamic value of Email from a data extension, we can use %%Email%% in an email template or on a CloudPage. On a CloudPage, an AmpScript code block starts with %%[ and ends with ]%%.

%%[

/** Start of variable declarations **/
var @newFirstName, @newLastName, @newEmail, @newPhone, @website, @submit
/** End of variable declarations **/

/** Set dynamic/static values on the desired variables **/
set @newFirstName = RequestParameter("Firstname")
set @newLastName = RequestParameter("Lastname")
set @newEmail = RequestParameter("email")
set @newPhone = RequestParameter("phone")
set @website = "Home Page"
set @submit = RequestParameter("submit")
/** End of setting values **/

/** Validate that the form was submitted before inserting into Sales Cloud **/
If @submit == "Submit" Then
CreateSalesforceObject("Contact", 4, "Email", @newEmail, "FirstName", @newFirstName, "LastName", @newLastName, "Phone", @newPhone)
Endif

]%%

<label>First Name:</label> <input name="Firstname" type="text" value="%%FirstName%%" />
<label>Last Name:</label> <input name="Lastname" type="text" value="%%LastName%%" />
<label>Email:</label> <input name="email" type="text" value="%%email%%" />
<label>Phone:</label> <input name="phone" type="number" value="%%phone%%" />
<input name="submit" type="submit" value="Submit" />

In the above code snippet, we start the AmpScript block with %%[ and declare variables using the var keyword. Every variable name must start with @. We have declared @newFirstName, @newLastName, @newEmail, and @newPhone for demo purposes.

var @newFirstName, @newLastName, @newEmail, @newPhone, @website, @submit

After declaring the variables, the next step is to populate them with values; a variable can hold text or numbers. In our scenario, we want to collect the dynamic values the customer entered on the form, such as first name, last name, email, phone, and the submit value. The RequestParameter function in AmpScript retrieves these dynamic values from the form.

set @newFirstName = RequestParameter("Firstname")
set @newLastName = RequestParameter("Lastname")
set @newEmail = RequestParameter("email")
set @newPhone = RequestParameter("phone")
set @website = "Home Page"
set @submit = RequestParameter("submit")

After declaring the variables and assigning their values, the next step is to use them to send the data to Sales Cloud/Service Cloud.

We should always wrap the insert in conditional logic to avoid unnecessary calls and errors. To implement conditional logic we can use If / If Else conditions, similar to other programming and scripting languages, although the syntax differs slightly.

If (condition is true) Then

do this

Endif

In our scenario, we check whether the @submit variable is populated with the value Submit and, if so, insert the record into Sales Cloud/Service Cloud.

If @submit == "Submit" Then
CreateSalesforceObject("Contact", 4, "Email", @newEmail, "FirstName", @newFirstName, "LastName", @newLastName, "Phone", @newPhone)
Endif
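For reference, the general shape of this function (not spelled out in the snippet above) is:

CreateSalesforceObject("Object API name", numberOfFieldValuePairs, "Field 1", value1, "Field 2", value2, ...)

The second argument must equal the number of field/value pairs that follow, which is why we pass 4 above. The function returns the Id of the record it creates, so you can store it in a variable if you need it later.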

 Conditional Rendering in Lightning Web Components

In this post, we will try to understand how to conditionally render different DOM elements in a Lightning web component. Part of the power of Lightning web components comes from the ability to display dynamic or desired DOM elements based on conditional directives. Salesforce provides two directives, if:true and if:false, to display the desired DOM element.

Html file(electric.html):

<template>

<div id="electriccar" if:false={pageload}>
<div>Name: {name}</div>
<div>Description: {description}</div>
</div>

<div id="regularCar" if:true={pageload}>
<div>Car Type: {cartype}</div>
<div>Mileage: {mileage}</div>
</div>

</template>

JavaScript file (electric.js): In the JavaScript file, add the properties and set the pageload property to false as shown below.

import { LightningElement } from 'lwc';

export default class Electric extends LightningElement {
    name = 'Tesla';
    description = 'Tesla is an Electric Car.';
    cartype = 'Regular Car';
    mileage = '30';
    pageload = false;
}

The above displays the following output because we set the pageload property to false in JavaScript. In the HTML file (electric.html) the if:false directive is evaluated against the pageload property, and since pageload is false, the Name and Description fields are rendered.
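To render the other branch at run time, you can simply flip the pageload property. Below is a minimal sketch (not part of the original example) that assumes a hypothetical button in electric.html such as <lightning-button label="Toggle" onclick={handleToggle}></lightning-button>:

import { LightningElement } from 'lwc';

export default class Electric extends LightningElement {
    name = 'Tesla';
    description = 'Tesla is an Electric Car.';
    cartype = 'Regular Car';
    mileage = '30';
    pageload = false;

    // Flipping the property re-evaluates the if:true / if:false directives,
    // so the regularCar markup renders instead of the electriccar markup.
    handleToggle() {
        this.pageload = !this.pageload;
    }
}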

Try it in playground: https://developer.salesforce.com/docs/component-library/tools/playground

 Reporting Hack: Power of 1 a Game Changer in Reports


In this post, we will try to understand the importance of the Power of 1 field on standard and custom objects, which helps with reporting. We can get high-level stats such as unique owner counts and unique parent-object records owned by a user when we use this field in reports. In the example below, we will create a formula field on standard objects such as Account, Opportunity, and User, and run a report on Opportunity to get the unique accounts and unique owners.

Business Requirement: Show Unique Account Count based on Owner in an Opportunity report

Setup -> Object Manager -> Account -> Fields & Relationships -> New -> Formula -> Account (Power of 1). You can give the field any desired name.

Make sure you set the Decimal Place as 0.

Enter 1 as the formula value, click Next, deselect all page layouts, and provide visibility to the desired profiles.

Similarly, create the Power of 1 field on the Opportunity object as well.
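For clarity, the field definition itself is trivial; the formula body on each object is simply the literal value 1:

/* Power of 1 formula field: return type Number, 0 decimal places */
1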

Let's see the Power of 1 in action after adding the field to the desired objects. For this demo we will create a report using the Opportunities with Products report type, applying filters such as Created Date (Current FY) and Opportunity Status equals Open.

From the above report, I cannot get the unique count of the accounts associated with the opportunities. This is where the Power of 1 field created on Account helps us get the desired result.

From the above image you can see there are 5 Account records (with duplicates) owned by Gary Smith and 9 Account records (with duplicates) owned by Sai Rakesh Puli. With the help of the Power of 1 formula field, we are able to get the unique Account records owned by Gary Smith and Sai Rakesh Puli. These stats are very helpful when we want unique counts from any object joined in a report.

 Understanding Wait Activity Attributes in Journey Builder

In this article, we will try to understand the different Wait Activity attributes in Journey Builder. Journey Builder helps define 1:1, personalized customer/prospect journeys across multiple channels such as Email, Mobile, Physical/Direct Mail, and Social Media.

A Wait Activity is a configurable period during which contacts are held between other activities. During a Wait Activity, Marketing Cloud evaluates the contacts. We also have to configure the time zone for Wait Activities.

There are 3 different types of Wait Activities

  1. Wait by Duration
  2. Wait Until Date
  3. Wait by Attribute

Wait by Duration: This is the default Wait Activity displayed on the canvas, with a duration of 1 day. We could increase the wait duration to the desired number of days, weeks, or months. Optionally, we could also specify a time and make contacts wait until then by selecting "Extend Wait duration until specific time".

Wait Until Date: Marketing Cloud contacts are held in wait mode until the specified date and time. Note: if a contact reaches the Wait Activity after the specified date and time, the contact proceeds to the next activity immediately.

Wait by Attribute: Marketing Cloud contacts are held until the day and time specified by a date attribute. For example, if we want to send a birthday email to customers/contacts, we could use Wait by Attribute with the customer birthdate field as the date attribute.

 Block users from Logging into Salesforce during Deployments

In this post, we will try to understand how to stop users from logging into Salesforce while deployments are in progress. Have you ever wondered how Salesforce displays a maintenance message to users? The answer is that Salesforce uses login flows to display the message. Let's see how we can use login flows, flows, and hierarchical custom settings together to stop users from logging into Salesforce during deployments.

Step 1: Creation of Custom settings

Create a custom setting with the desired name; for this demo we will call it "Block users from Login". We need to create two fields on it: BlockUser (a checkbox) and DisplayMessage (a text area). If the checkbox is selected, the message from the Display Message field should be displayed to users.

Step 2: Creation of flow

Create a flow with the desired name. For demo purposes, we will call it "Block users from Login Service".

Get Records: From the left pane, select Get Records and drop it onto the canvas. In this step we fetch the record from our custom setting (Block users from Login) as shown in the image. Fill in all the mandatory fields and select the first record:

Object – Block users from Login

Condition Requirements – Get all Block users from Login

Sort Order – Not Sorted

How many records to store – Only the first record

How to store the record data – Choose fields and assign variables (Advanced)

Select Variable to Store Block users from Login

Record – {!blockusers}

Field – BlockUser__c

Field – DisplayMessage__c

After entering the above information, click Done. With this, we have successfully configured the Get Records element.

In the next step, we will check whether the BlockUser field on the fetched custom setting (Block users from Login) is set to true, using a Decision element.

Decision: Drag and drop the Decision element from the left palette onto the canvas and enter the desired label and API name. In the outcome order, click + and enter the details below: give the outcome a label and an API name, set "When to Execute Outcome" to "All Conditions Are Met", and in the resource check that the BlockUser field equals true using the global constant. Click Save and connect the Get Records element to the Decision element.

Screen: From the left palette, select Screen and drop it onto the canvas. Give the screen a label and API name in the screen properties. In the Configure Frame section, select "Show Header" only.

Note: Make sure you deselect "Show Footer", or else users will see a button and be able to navigate/log in to Salesforce.

The end result of the flow should look as shown below; make sure you go back and activate your flow.

Login flows:

Create a new login flow with the desired name and select the flow we created (Block_users_from_Login_Service). Select the user license as Salesforce and the profile whose users you want to block from logging in. In my scenario, I wanted to block users with the Standard User profile from logging into Salesforce.

Testing:

Log in with any Standard User profile user; you should see the information message from our custom setting, and the user is stopped from navigating into Salesforce.

Note: Once the deployment/maintenance is complete, make sure to uncheck the BlockUser__c field to allow users back into Salesforce.

In this article, we will try to understand the custom preference center and also compare it with the profile center/subscription center to get an overall understanding. A custom preference center provides the ability to have both profile and subscription information on one page. We can customize the preference center page with the desired options that allow subscribers to opt in or opt out, and also provide an additional option to unsubscribe from all emails. If you are using Marketing Cloud Connect, then by default you will have to use a custom preference center.

Ideas to use Preference Center:

  1. You want to provide a few options to your subscribers (customers, prospects, etc.), such as asking whether they are interested in getting emails about birthday coupons, networking events, and promotional events.
  2. You could also ask customers to opt in or opt out of newsletters, feedback emails, conferences, and events. Since this is a CloudPage, you could retrieve the preferences from a data extension or Sales Cloud and display the subscriber's preferences.

You can build a custom preference center using a Smart Capture form if you are using lists. If you are using data extensions or Marketing Cloud Connect, you can use CloudPages and AmpScript to capture the subscriber preferences and update data extensions, Sales Cloud, or Service Cloud, as sketched below.
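For illustration only, here is a minimal AmpScript sketch of that idea. It assumes a hypothetical "Preferences" data extension keyed on EmailAddress with a Newsletter column, and a form field named newsletter:

%%[
var @email, @newsletter
set @email = RequestParameter("email")
set @newsletter = RequestParameter("newsletter")
/* Upsert the subscriber's choice into the Preferences data extension */
UpsertDE("Preferences", 1, "EmailAddress", @email, "Newsletter", @newsletter)
]%%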

Considerations for using Preference Center vs Profile/Subscription Center:

In this post, we will try to design a process using a custom metadata type to run a batch job at the desired intervals.

Custom Metadata Type – Schedule Batch Apex Jobs

This custom metadata type is used to define how often the batch jobs must be executed. It can also be reused in the future for other batch jobs developed over the course of Salesforce application development.


As seen in the illustration above, the custom metadata type comprises three custom fields:

  1. Every X Minutes (API Name: Every_X_Minutes__c): This is a text field. If defined, it ensures the batch job is executed every "X" minutes. The value for "X" should usually be a multiple of 5 and less than 60, for example 20, 30, 40, or 45.
  2. Execute Every Hour (API Name: Execute_Every_Hour__c): This is a checkbox field. If checked, it ensures the batch job is executed every hour of the day.
  3. Hour Of The Day (API Name: Hour_Of_The_Day__c): This is a text field. If defined, it ensures the batch job is executed at the hour of the day mentioned in this field. The hour should be in 24-hour format, with values ranging from 0 to 23. For example, if the batch must be scheduled to run at 1 PM, the hour in this field must be 13; likewise, for 4 PM it must be 16.

Validation Rules on the Custom Metadata Type


The above validation rules can be defined on the custom metadata type to ensure that multiple inputs aren't provided for the batch scheduling interval. The intent is that the batch is configured to run either every "X" minutes, or every hour, or at a specific time of day. In Salesforce, the CRON expression used for scheduling changes depending on which of these parameters is set.

For example, a batch job could be executed every hour of the day, or once every day at a specific time. However, it is very unlikely that batches need to be executed every 5 or 10 minutes. There may be cases where batches need to be executed ad hoc, typically in ISV product-development scenarios; even then, a Lightning or Salesforce Classic button placed on a Lightning page or Visualforce page would usually invoke the batch job ad hoc.

Moreover, the downside of scheduling batch jobs to run every 5 minutes is that the batch apex job runs at slightly different times because it executes asynchronously. This can produce wrong outputs or results, especially if we are calculating aggregate totals or provisioning/revoking permissions in the form of permission sets or share records.

 Setting the CRON expression to run the batch every 'X' minutes

As a developer, the idea is to ensure the scheduler gets the right CRON expression each time it must invoke the batch apex job at the expected regular interval. Always remember that to execute the batch job every 30 minutes, the batch job needs to be invoked twice every hour, which technically implies that the scheduler has to be scheduled twice every hour.

System.schedule('SchedulerJob - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), '0 0 * * * ?', new SchedulerJob());

System.schedule('SchedulerJob - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), '0 30 * * * ?', new SchedulerJob());

With the above invocations, the batch job gets scheduled every 30 minutes, that is, at the 0th minute and at the 30th minute of every hour. The invoker class must call the scheduler class twice so that the batch job gets called from the scheduler job.

System.schedule('SchedulerJob - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), '0 0 * * * ?', new SchedulerJob());

System.schedule('SchedulerJob - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), '0 15 * * * ?', new SchedulerJob());

System.schedule('SchedulerJob - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), '0 30 * * * ?', new SchedulerJob());

System.schedule('SchedulerJob - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), '0 45 * * * ?', new SchedulerJob());

In the above example, if we set the custom metadata type so that the batch job runs every 15 minutes through the scheduler job, the scheduler needs to be invoked 4 times, that is, at the 0th, 15th, 30th, and 45th minute of the hour.

public class InvokeScheduler {

    public void SchedulerMethod() {

        Schedule_Batch_Apex_Jobs__mdt scheduleBatchMDT = [SELECT Every_X_Minutes__c, Execute_Every_Hour__c, Hour_Of_The_Day__c FROM Schedule_Batch_Apex_Jobs__mdt WHERE DeveloperName = 'Unique Name of the Custom Metadata Type data record'];

        String CRON_EXPR;

        if ((scheduleBatchMDT.Every_X_Minutes__c != null) || Test.isRunningTest()) {

            if (Test.isRunningTest()) {
                scheduleBatchMDT.Every_X_Minutes__c = '20';
            }

            List<String> CRON_SubStringList = new List<String>();
            Integer X_MINS = Integer.valueOf(scheduleBatchMDT.Every_X_Minutes__c);
            Integer Interval = Integer.valueOf(60 / X_MINS);
            System.debug('*** Interval ===> ' + Interval);

            if (Interval == 1) {
                Interval += 1;
            }

            Integer seriesVal = X_MINS;
            Integer seriesCount = 0;

            for (Integer i = 0; i < Interval; i++) {
                seriesCount += seriesVal;
                if (seriesCount < 60) {
                    CRON_SubStringList.add(String.valueOf(seriesCount));
                }
            }

            for (String schedulerMinsInterval : CRON_SubStringList) {
                CRON_EXPR = '0 ' + schedulerMinsInterval + ' * * * ?';
                scheduleBatch.scheduleMe(CRON_EXPR);
            }
        }

        if ((scheduleBatchMDT.Execute_Every_Hour__c != false) || Test.isRunningTest()) {
            CRON_EXPR = '0 0 * * * ?';
            System.debug('*** I am going to execute the batch every hour ****');
            scheduleBatch.scheduleMe(CRON_EXPR);
        }

        if ((scheduleBatchMDT.Hour_Of_The_Day__c != null) || Test.isRunningTest()) {

            if (Test.isRunningTest()) {
                scheduleBatchMDT.Hour_Of_The_Day__c = '21';
            }

            CRON_EXPR = '0 0 ' + scheduleBatchMDT.Hour_Of_The_Day__c + ' * * ?';
            scheduleBatch.scheduleMe(CRON_EXPR);
        }
    }
}

The above class is the invoker class that ensures several instances of the scheduler are instantiated to run at specific times at regular intervals. It is responsible for preparing the CRON expressions that define how often the scheduler jobs must be instantiated, which in turn are responsible for invoking the batch. Before calculating the CRON expression, it is important to understand how the custom metadata type has been configured.

The Interval variable identifies how many iterations we need in order to build the CRON expressions. CRON_SubStringList must be a List collection so that the minute values stay in order; a Set collection would not work because a Set is unordered.

Can we call the Scheduler class from the Scheduler class itself (Recursion calls)?

We cannot do this, because the scheduler would go into an infinite, never-ending loop and keep spawning new processes. Some blogs also suggest querying the CronTrigger table to find the immediately preceding scheduler job Id and abort it; that is not a real solution to this problem. This is the very reason it makes sense to control the scheduler job invocation from an external class, which in this case is the InvokeScheduler Apex class.

global class scheduleBatch implements Schedulable {

    global static void scheduleMe(String CRON_EXPR) {
        System.schedule('Schedule Job - ' + String.valueOf(Math.abs(Crypto.getRandomLong())).substring(0, 5), CRON_EXPR, new scheduleBatch());
    }

    global void execute(SchedulableContext sc) {
        Database.executeBatch(new actualBatchJobName(), 200);
    }
}

The above Apex class is the actual scheduler class, implementing the Schedulable interface. It has two methods: execute(), which runs in the schedulable context, and scheduleMe(), a static method invoked from the InvokeScheduler class to ensure the scheduler is instantiated dynamically, depending on the number of instances that have to be spawned based on the custom metadata type setup.
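To kick everything off, you would run the invoker once, for example from Execute Anonymous. A minimal sketch, using the class and method names defined above:

// Reads the custom metadata record and schedules the required scheduleBatch instances
new InvokeScheduler().SchedulerMethod();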

In this article, we will cover the basics of AmpScript and insert records into a data extension. In our previous article, we learned how to personalize using AmpScript. Inline AmpScript starts and ends with %%. For example, to get the dynamic value of Email from a data extension, we can use %%Email%% in an email template or on a CloudPage. On a CloudPage, an AmpScript code block starts with %%[ and ends with ]%%.

%%[

/** Start of variable declarations **/
var @newName, @newEmail, @newPhone, @website, @submit
/** End of variable declarations **/

/** Set dynamic/static values on the desired variables **/
set @newName = RequestParameter("name")
set @newEmail = RequestParameter("email")
set @newPhone = RequestParameter("phone")
set @website = "Home Page"
set @submit = RequestParameter("submit")
/** End of setting values **/

/** Validate that the form was submitted before inserting into the data extension **/
If @submit == "Submit" Then
InsertDE("Customer_Form", "Email", @newEmail, "Name", @newName, "Phone", @newPhone)
Endif

]%%

<label>Name:</label> <input name="name" type="text" value="%%Name%%" />
<label>Email:</label> <input name="email" type="text" value="%%email%%" />
<label>Phone:</label> <input name="phone" type="number" value="%%phone%%" />
<input name="submit" type="submit" value="Submit" />

In the above code snippet, we start the AmpScript block with %%[ and declare variables using the var keyword. Every variable name must start with @. We have declared @newName, @newEmail, and @newPhone for demo purposes.

var @newName, @newEmail, @newPhone, @website, @submit

After declaring the variables, the next step is to populate them with values; a variable can hold text or numbers. In our scenario, we want to collect the dynamic values the customer entered on the form, such as name, email, phone, and the submit value. The RequestParameter function in AmpScript retrieves these dynamic values from the form.

set @newName = RequestParameter("name")
set @newEmail = RequestParameter("email")
set @newPhone = RequestParameter("phone")
set @website = "Home Page"
set @submit = RequestParameter("submit")

After declaring the variables and assigning their values, the next step is to use them to send the data to the data extension.

We should always wrap the insert in conditional logic to avoid unnecessary calls and errors. To implement conditional logic we can use If / If Else conditions, similar to other programming and scripting languages, although the syntax differs slightly.

If (condition is true) Then

do this

Endif

In our scenario, we check whether the @submit variable is populated with the value Submit and, if so, insert the record into the data extension.

If @submit == "Submit" Then
InsertDE("Customer_Form", "Email", @newEmail, "Name", @newName, "Phone", @newPhone)
Endif

The InsertDE function inserts data into a data extension. The first parameter is the data extension name and is required. After that we pass pairs of column names and values for the columns to be inserted.

Syntax

InsertDE(1, 2, 3)

Ordinal | Type   | Required | Description
1       | string | Required | Name of the data extension into which to insert the specified row
2       | string | Required | Column name used to build the insert clause
3       | string | Required | Column value used to build the insert clause

Usage

InsertDE('SomeDE', 'FirstName', FirstName, 'LastName', LastName, 'CreatedDate', NOW())

In this article, we will try to understand why we need segmentation and the various ways we can segment data. Segmentation is targeting a desired set of audiences from all available subscribers. For example, if you want to launch a promotion to existing customers in the 30 to 50 age group, you could use age as your criterion and send promotional messages only to those customers instead of all available customers. This helps you target the right audience, and also leads to fewer unsubscribes and better IP reputation. Segmentation also helps increase conversion rates and click-through rates.

Some of the business use cases for Segmentation are: 

  1. Segmentation can be done based on age, gender, or geographic location to send personalized marketing messages.
  2. Segmentation can also be done based on purchase history to show related products.
  3. You could also filter desired subscribers based on personal interests for promotional events.

The above-mentioned Scenarios are for illustration purposes and you could filter the audience based on any desired criteria.

Segmentation Tools:

  1. Segmenting lists – Lists hold subscriber information. Segmenting lists can be done using groups and data filters.
  2. Segmenting data extensions – Data extensions can hold subscriber information or related data. Segmenting a data extension can be done using SQL queries and data filters (a sample query is sketched below).
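For illustration only, a minimal sketch of a Query Activity for the age-based example above, assuming a hypothetical Subscribers_Master data extension with an Age column:

SELECT SubscriberKey, EmailAddress, Age
FROM Subscribers_Master
WHERE Age BETWEEN 30 AND 50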

 Setup JEST unit Testing Framework for Lightning Web Components

In this post, we will learn how to set up the Jest unit testing framework for Lightning web components.

What is Jest ?

Jest is a powerful tool with rich features for writing JavaScript tests. Jest can collect code coverage information and supports mocking to help isolate tests from complex dependencies. Jest tests don’t run in a browser or connect to an org, so they run fast. Use Jest to write unit tests for all your Lightning web components. To run Jest tests for Lightning web components, you need the @salesforce/sfdx-lwc-jest Node module in your Salesforce DX project.

Installing SFDX-Jest

The following JSON is an example of the package.json. Before running the Jest setup command from the VS Code terminal or the node module installation script, please ensure that this JSON file is part of the VS Code SFDX project. The reason is that this file is parsed when the node module installation script is executed on a Windows machine.

Note: Always ensure the name attribute in the JSON matches the project folder name as seen in the project explorer in VS Code.

Before issuing the Node-based scripts, we need to ensure Node.js is installed on our PC/laptop. On Windows, download and run the installer from the Node.js website; on Mac/Linux we can install it directly from the terminal.
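A quick way to confirm the installation from any terminal (the version numbers on your machine will differ):

node --version
npm --version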

Once you are sure that Node.js has been successfully installed, issuing the command sfdx force:lightning:lwc:test:setup in the VS Code terminal will print a couple of npm deprecation warnings. On some Windows machines this command fails with an error stating that, due to vulnerabilities, it was not possible to complete the package installation entirely. However, the screenshot does show that the test setup completed, with vulnerabilities observed.

You can also run the script directly from where it was installed in the project using the following Node command. In the Visual Studio Code terminal, in the top-level directory of the Salesforce DX project, enter the following command.

node node_modules/@salesforce/sfdx-lwc-jest/bin/sfdx-lwc-jest

Doing this will setup the node_modules directory locally in the project directory wherever this node modules script is being executed.

In some SFDX projects, the absence of the package.json file can cause the sfdx force:lightning:lwc:test:setup command to fail. The error message would look something like this:

Sometimes we may need to run the following commands in the order below to ensure lwc-jest gets installed and set up properly on Windows-based machines.

1st: npm install --global --production windows-build-tools

2nd: npm install @salesforce/lwc-jest --save-dev

Both of the above commands can be run via the Git for Windows shell or PowerShell on a Windows machine. For the above commands, ensure PowerShell is run in the Administrator user's context by launching it as Administrator from the Start menu.

This step ensures that the lwc-jest framework has been successfully installed. I faced this issue while working on the LWC unit-testing Salesforce Trailhead modules, so I thought of covering how I managed to fix it in this blog.

Automate Test Scripts with Package.json and npm

A goal of having unit tests is to encourage developers to write and run them as part of their development and continuous integration process so that bugs are identified and fixed sooner rather than later. Having to remember and type long commands like the one above over and over is counterproductive to that goal. Here's where automation comes in.

npm has some great out-of-the-box script automation flexibility. Running the install earlier added a series of options to the scripts property of the package.json file at the root of your project.

{
  "name": "test-lwc",
  …
  "scripts": {
    "test:unit": "sfdx-lwc-jest",
    "test:unit:watch": "sfdx-lwc-jest --watch",
    "test:unit:debug": "sfdx-lwc-jest --debug",
    "test:unit:coverage": "sfdx-lwc-jest --coverage",
  },
  …
}

If you want to run all tests for your project, run this npm command from the base directory of your project.

npm run test:unit

Run Tests Continuously During Development

To run all tests for a single component every time you save changes, change directories to the component directory and run the npm command below that utilizes sfdx-lwc-jest with the --watch parameter. As mentioned above, you could also run this from the base of the project and have all tests in the project run for every change. Git needs to be initialized for --watch to work from the command line.

npm run test:unit:watch

With Git initialized, Jest now watches all component files for updates and runs all relevant tests every time it detects a change.

Run Tests in Jest Debug Mode

To run the project's Jest tests in debug mode, run the npm command below that utilizes sfdx-lwc-jest with the --debug parameter.

npm run test:unit:debug

Run Tests and Display Code Coverage

To see the code coverage of the tests, use the --coverage option below.

npm run test:unit:coverage
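To round this off, here is a minimal sketch of what an actual test could look like, written against the electric component from the conditional-rendering example earlier in this post. The file path and assertions are hypothetical; the test simply checks that the electric-car branch renders because pageload defaults to false.

// force-app/main/default/lwc/electric/__tests__/electric.test.js (hypothetical path)
import { createElement } from 'lwc';
import Electric from 'c/electric';

describe('c-electric', () => {
    afterEach(() => {
        // Reset the DOM between tests
        while (document.body.firstChild) {
            document.body.removeChild(document.body.firstChild);
        }
    });

    it('shows the electric car details when pageload is false', () => {
        // Create and mount the component; pageload defaults to false
        const element = createElement('c-electric', { is: Electric });
        document.body.appendChild(element);

        // The if:false branch should render the Name/Description markup
        const divs = Array.from(element.shadowRoot.querySelectorAll('div'));
        const renderedText = divs.map((div) => div.textContent).join(' ');
        expect(renderedText).toContain('Tesla');
        expect(renderedText).not.toContain('Regular Car');
    });
});

Running npm run test:unit (or the watch variant) will pick this file up automatically.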

Post Contributor:

Rakesh Ramaswamy

https://www.linkedin.com/in/rakesh-ramaswamy-38062385/