Saturday, 30 May 2015

Intro to Unit Test Data Creation Framework continued.......

Read the first blog at

In my last blog I introduced my Unit Test Data Creation Framework; now we will start building the classes of the framework.

First create the Constants class

public class Constants {

    public static final String CONST_Account = 'ACCOUNT';
    public static final String CONST_Contact = 'CONTACT';
}

Now, in the ITestData class add

public interface ITestData {
    List<sObject> returnAnyObject(String jsonStr, KeyValue[] kVals);
}

Next, in the TestDataJsonLibrary class add

public class TestDataJsonLibrary {

    public static String referenceKey = 'ReferenceID';

    public class Standard{

        public final Map<String, String> libraryMap = new Map<String, String>{
            Constants.CONST_Account => '{"attributes":{"type":"Account"},"Field1__c":"Value 1","Field2__c":"Value 2"}',
            Constants.CONST_Contact => '[{"attributes":{"type":"Contact"},"' + TestDataJsonLibrary.referenceKey + '":"Reference Value","Field1__c":"Value 1"}]'
        };
    }
}



In the Return Data class add

public abstract class TestDataReturnData implements ITestData{

    public Boolean bulkModeOn = false;

    public Map<System.Type, String> overrideJson = new Map<System.Type, String>();

    public List<sObject> returnAnyObject(String jsonStr, KeyValue[] kVals){
        List<sObject> sobj;
        jsonStr = deserialJson(jsonStr, kVals);
        if(jsonStr.startsWith('['))
            sobj = (List<sObject>) System.Json.deserialize(jsonStr, List<sObject>.class);
        else
            sobj = (List<sObject>) System.Json.deserialize('[' + jsonStr + ']', List<sObject>.class);
        if(kVals != null){
            for(sObject obj : sobj){
                UtilDML.setObjData(obj, kVals);
            }
        }
        return sobj;
    }

    private String deserialJson(String jsonStr, KeyValue[] kVals){
        //the reference filter only applies to json arrays used with KeyValues
        if(kVals == null || !jsonStr.trim().startsWith('['))
            return jsonStr;
        List<Object> deserialLst = (List<Object>) JSON.deserializeUntyped(jsonStr.unescapeEcmaScript());
        String aReferenceKey;
        //when setting the fields from the KeyValues, if one is the lookup field set it to aReferenceKey
        for(KeyValue kv : kVals){
            if(kv.key == TestDataJsonLibrary.referenceKey)
                aReferenceKey = kv.value;
        }
        List<sObject> serialLst = new List<sObject>();
        for(Object obj : deserialLst){
            Map<String, Object> objMap = (Map<String, Object>) obj;
            //keep only the record matching the requested reference key
            if(aReferenceKey == objMap.get(TestDataJsonLibrary.referenceKey)){
                objMap.remove(TestDataJsonLibrary.referenceKey); //the reference key is not a real field
                serialLst.add(UtilDML.convertToSobject(objMap));
            }
        }
        return JSON.serialize(serialLst);
    }
}
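As a quick sketch of what returnAnyObject gives you (a minimal usage example, assuming the classes compile as sketched above; the json and field values are only illustrative):

```apex
// returns a cast-ready Account without touching the database
ITestData td = new TestDataInsertData();
List<sObject> objs = td.returnAnyObject(
    '{"attributes":{"type":"Account"},"Name":"Acme"}',
    new List<KeyValue>{ new KeyValue('Name', 'Acme Renamed') });
Account acc = (Account) objs[0];
```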


In the Insert Data class add

public virtual class TestDataInsertData extends TestDataReturnData{

    private sObject insertAnyObject(String jsonStr, KeyValue[] kVals, System.Type objType){
        if(overrideJson != null && overrideJson.containsKey(objType))
            jsonStr = overrideJson.get(objType);
        sObject sobj = super.returnAnyObject(jsonStr, kVals)[0];
        // bulkModeOn is set to true by the bulk data class so records can be inserted together later
        if(bulkModeOn == false)
            insert sobj;
        return sobj;
    }

    public Contact insertContact(String jsonstr, KeyValue[] kVals){
        return (Contact) insertAnyObject((jsonstr != null && jsonstr != '') ? jsonstr : new TestDataJsonLibrary.Standard().libraryMap.get(Constants.CONST_Contact), kVals, Contact.class);
    }
    // insertAccount, insertOpportunity etc. follow the same pattern
}
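A hedged usage sketch of insertContact; passing null for the json string falls back to the library json, and the ReferenceID KeyValue picks the matching record out of that json array (LastName is only an illustrative field):

```apex
TestDataInsertData td = new TestDataInsertData();
Contact cont = td.insertContact(null, new List<KeyValue>{
    new KeyValue('ReferenceID', 'Reference Value'), // matches the record in the Standard library json
    new KeyValue('LastName', 'Smith')
});
```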


In the Update Data class add

public virtual class TestDataUpdateData extends TestDataInsertData{

    public Account updateAccount(KeyValue[] insertkVals, KeyValue[] updatekVals){
        Account acc = super.insertAccount(insertkVals);
        if(updatekVals != null)
            acc = (Account) UtilDML.setObjData(acc, updatekVals);
        update acc;
        return acc;
    }
}


In the ComplexData class add

public virtual class TestDataComplexData extends TestDataInsertData{
    public Account acc{get;set;}
    public Contact cont{get;set;}
    public Opportunity opp{get;set;}

    public Account insertContactAndAccount(Map<System.Type, List<KeyValue>> keyMap){
        //inserts just 1 Account and 1 Contact and links them together; using the map in the argument means you only need 1 argument

        //stops a null exception occurring later in the code
        if(keyMap == null){
            keyMap = new Map<System.Type, List<KeyValue>>();
            keyMap.put(Contact.class, new List<KeyValue>());
        }else if(keyMap.containsKey(Contact.class) == false){
            keyMap.put(Contact.class, new List<KeyValue>());
        }
        this.acc = super.insertAccount(keyMap.get(Account.class));

        // now provide the Id in the KeyValues to link the Objects together
        keyMap.get(Contact.class).add(new KeyValue('AccountId', this.acc.Id, 'ID'));

        this.cont = super.insertContact(keyMap.get(Contact.class));
        return this.acc;
    }

    public Account insertContactOpportunityAndAccount(Map<System.Type, List<KeyValue>> keyMap){
        //reuse the Account and Contact logic above, then add the linked Opportunity
        insertContactAndAccount(keyMap);

        if(keyMap.containsKey(Opportunity.class) == false)
            keyMap.put(Opportunity.class, new List<KeyValue>());

        keyMap.get(Opportunity.class).add(new KeyValue('AccountId', this.acc.Id, 'ID'));
        this.opp = super.insertOpportunity(keyMap.get(Opportunity.class));
        return this.acc;
    }
}
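A hedged usage sketch of the complex data class, assuming insertAccount and insertContact overloads following the pattern shown earlier (the field values are only illustrative):

```apex
TestDataComplexData td = new TestDataComplexData();
Map<System.Type, List<KeyValue>> keyMap = new Map<System.Type, List<KeyValue>>{
    Account.class => new List<KeyValue>{ new KeyValue('Name', 'Acme') },
    Contact.class => new List<KeyValue>{ new KeyValue('LastName', 'Smith') }
};
Account acc = td.insertContactAndAccount(keyMap);
// td.cont.AccountId now points at td.acc
```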


In the BulkData class add

public virtual class TestDataBulkData extends TestDataInsertData{

    public Account acc{get;set;}
    public List<Contact> conts{get;set;}

    public Account insertAccountAndContacts(Map<System.Type, KeyValueBulk> keyMap){
        //This inserts 1 Account and a number of Contacts

        //stops a null exception occurring later in the code
        if(keyMap == null){
            keyMap = new Map<System.Type, KeyValueBulk>();
            keyMap.put(Contact.class, new KeyValueBulk());
        }else if(keyMap.containsKey(Contact.class) == false){
            keyMap.put(Contact.class, new KeyValueBulk());
        }
        this.conts = new List<Contact>();
        this.acc = super.insertAccount(keyMap.get(Account.class) == null ? null : keyMap.get(Account.class).keyValueBulkLst);
        bulkModeOn = true; // stops records being inserted one at a time
        //link records together
        keyMap.get(Contact.class).keyValueBulkLst.add(new KeyValue('AccountId', this.acc.Id, 'ID'));
        List<KeyValue> kVals = keyMap.get(Contact.class).keyValueBulkLst;
        for(Integer i = 0; i < keyMap.get(Contact.class).insertRecs; i++){
            this.conts.add(super.insertContact(null, kVals));
        }
        insert this.conts; //single DML statement for all the Contacts
        bulkModeOn = false; //resets flag
        return this.acc;
    }
}
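A hedged usage sketch of the bulk data class, assuming an insertAccount overload following the same pattern as insertContact (the counts and field values are only illustrative):

```apex
TestDataBulkData td = new TestDataBulkData();
Map<System.Type, KeyValueBulk> keyMap = new Map<System.Type, KeyValueBulk>{
    Account.class => new KeyValueBulk(1, new List<KeyValue>{ new KeyValue('Name', 'Acme') }),
    Contact.class => new KeyValueBulk(200, new List<KeyValue>{ new KeyValue('LastName', 'Bulk') })
};
Account acc = td.insertAccountAndContacts(keyMap);
// 1 Account inserted, then 200 Contacts inserted in a single DML statement
```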

KeyValue class looks like this

public class KeyValue{

    public String key{get; set;}
    public String value{get; set;}
    public String fieldType{get; set;}

    public KeyValue(String key, String value, String fieldType){
        this.key = key;
        this.value = value;
        this.fieldType = fieldType.toUpperCase();
    }

    public KeyValue(String key, String value){
        this.key = key;
        this.value = value;
    }

    public KeyValue(){}
}

KeyValueBulk class looks like this

public class KeyValueBulk{

    public Integer insertRecs;
    public KeyValue[] keyValueBulkLst;

    public KeyValueBulk(Integer insRecs, KeyValue[] kys){
        //default to creating 1 record
        insertRecs = (insRecs != null && insRecs > 0) ? insRecs : 1;
        keyValueBulkLst = kys;
    }

    public KeyValueBulk(){
        insertRecs = 1;
        keyValueBulkLst = new List<KeyValue>();
    }
}


Also, if you have any triggers, create the TriggerController class for rapid transactional processing.

It lets you bypass code in triggers, which often fire workflows and process builders, which in turn fire triggers again; all of which takes extra processing time. When you are simply creating test data you are implicitly telling the system to create an exact data set, so if you are not actually testing the trigger there is no need to run through the trigger code while creating the test data.

In the TriggerController class add

public class TriggerController {

            //used specifically in the unit test data framework
            public static Map<System.Type, Boolean> rapidProcessing;

            //Account - Disable / Enable parts of trigger
            public static Boolean Account_DisableAllTypes = false;
            public static Boolean Account_DisableInsert = false;
            public static Boolean Account_DisableUpdate = false;
            public static Boolean Account_DisableDelete = false;
            public static Boolean Account_DisableUnDelete = false;

            //used to unit test the trigger control to ensure the correct parts of the trigger were triggered
            public static Boolean Account_Insert_Succeeded = false;
            public static Boolean Account_Update_Succeeded = false;
            public static Boolean Account_Delete_Succeeded = false;
            public static Boolean Account_UnDelete_Succeeded = false;
}

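To show how the flags are intended to be used, here is a hedged sketch of an Account trigger honouring the controller (the trigger body itself is only illustrative):

```apex
trigger AccountTrigger on Account (before insert, before update) {
    // bail out when the test data framework has disabled the whole trigger
    if(TriggerController.Account_DisableAllTypes) return;

    if(Trigger.isInsert && !TriggerController.Account_DisableInsert){
        // normal insert logic here...
        TriggerController.Account_Insert_Succeeded = true; // lets unit tests assert this path ran
    }
    if(Trigger.isUpdate && !TriggerController.Account_DisableUpdate){
        // normal update logic here...
        TriggerController.Account_Update_Succeeded = true;
    }
}
```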

In the UtilDML class add

public class UtilDML {

    public static Sobject setObjData(Sobject aobj, KeyValue[] kVals){
        if (kVals != null){
            for (KeyValue eachval : kVals){
                try{
                    setFieldVal(aobj, eachval);
                }catch(Exception ex){ System.debug('## ex ' + ex); }
            }
        }
        return aobj;
    }

    public static Sobject setFieldVal(Sobject obj, KeyValue thisKeyVal){
        System.debug('## thisKeyVal ' + thisKeyVal);
        if (thisKeyVal.fieldType == 'DATE'){
            //Date.valueOf expects the format yyyy-MM-dd (10 characters)
            Date dt = Date.valueOf(thisKeyVal.value.substring(0, 10).trim());
            obj.put(thisKeyVal.key, dt);
        }else if (thisKeyVal.fieldType == 'DATETIME'){
            obj.put(thisKeyVal.key, Datetime.valueOf(thisKeyVal.value));
        }else if (thisKeyVal.fieldType == 'DECIMAL'){
            obj.put(thisKeyVal.key, Decimal.valueOf(thisKeyVal.value));
        }else if (thisKeyVal.fieldType == 'INTEGER'){
            obj.put(thisKeyVal.key, Integer.valueOf(thisKeyVal.value));
        }else if (thisKeyVal.fieldType == 'LONG'){
            obj.put(thisKeyVal.key, Long.valueOf(thisKeyVal.value));
        }else if (thisKeyVal.fieldType == 'DOUBLE'){
            obj.put(thisKeyVal.key, Double.valueOf(thisKeyVal.value));
        }else if (thisKeyVal.fieldType == 'BOOLEAN'){
            obj.put(thisKeyVal.key, thisKeyVal.value.toUpperCase() == 'TRUE');
        }else if (thisKeyVal.fieldType == 'BLOB'){
            obj.put(thisKeyVal.key, Blob.valueOf(thisKeyVal.value));
        }else if (thisKeyVal.fieldType == 'ID'){
            obj.put(thisKeyVal.key, (Id) thisKeyVal.value);
        }else{
            obj.put(thisKeyVal.key, thisKeyVal.value); //String default
        }
        return obj;
    }

    public static sObject convertToSobject(Map<String, Object> objMap){

        sObject sObj = Schema.getGlobalDescribe().get((String)((Map<String, Object>)objMap.get('attributes')).get('type')).newSObject();
        Map<String, Schema.SObjectField> sObjMap = sObj.getSObjectType().getDescribe().fields.getMap();

        for(String key : objMap.keySet()){
            if(key == 'attributes') continue; //the attributes entry is metadata, not a field
            Schema.DescribeFieldResult field = sObjMap.get(key).getDescribe();
            String fieldType = field.getType().name();
            String value = String.valueOf(objMap.get(key)); //untyped json values may not be Strings
            if(fieldType == 'DATE'){
                sObj.put(key, Date.valueOf(value));
            }else if(fieldType == 'DATETIME'){
                sObj.put(key, Datetime.valueOf(value));
            }else if(fieldType == 'DECIMAL'){
                sObj.put(key, Decimal.valueOf(value));
            }else if(fieldType == 'INTEGER'){
                sObj.put(key, Integer.valueOf(value));
            }else if(fieldType == 'LONG'){
                sObj.put(key, Long.valueOf(value));
            }else if(fieldType == 'DOUBLE'){
                sObj.put(key, Double.valueOf(value));
            }else if(fieldType == 'BOOLEAN'){
                sObj.put(key, value.toUpperCase() == 'TRUE');
            }else{ //String
                sObj.put(key, value);
            }
        }
        return sObj;
    }
}
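A quick hedged sketch of UtilDML in action (the field names are only illustrative):

```apex
Contact cont = new Contact();
UtilDML.setObjData(cont, new List<KeyValue>{
    new KeyValue('LastName', 'Smith'),              // no fieldType, so treated as a String
    new KeyValue('Birthdate', '1980-01-01', 'DATE') // converted via Date.valueOf
});
```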


In my next blog we will start building some examples of using the framework. If you have any questions about the framework, please add comments on my blog.

Sunday, 24 May 2015

Unit Test Data Creation Framework

I've worked in so many companies where developers copy and paste code from one unit test to another to create test data for the test method. Then, when a new required field starts failing tests and they need to add it, they realise they have to copy the change to all of those classes.

Some companies improve on this by having a central class, but its methods take a separate argument for each field, so every new field that needs to be passed in requires a change to this class.

Some improve this by passing in a list of key value pair classes. This will become clearer when we start coding in my follow-on blog to this one.

Some may improve even further by adding a for loop to create as many records as you like.

But I've never seen a company combine all the benefits above with using JSON to serialise and deserialise to / from generic sObjects, and use OOP to provide a logical framework that separates the functionality: returning cast sObjects, inserting single objects, inserting multiple objects for bulk testing, updating objects, inserting multiple objects of different types and linking them together, and inserting multiple objects to create complex linked data structures.

This builds a highly flexible capability into the framework, so whatever type of test data is required, the framework can easily create it.

It also provides an option for rapid transactional processing to speed up unit tests and deployments.

The Framework can also be used to create data in Salesforce for many purposes such as data migration, exposing for web services etc.

So let's get started.
First create a number of classes:

            ITestData - This is the interface class

            TestDataJsonLibrary - Provides the json strings that will be serialised into sobjects

            TestDataReturnData - Serialises json strings from TestDataJsonLibrary or custom json strings

            TestDataInsertData - DML transactions of the serialised json strings occur here, but it only inserts 1 record

            TestDataUpdateData - Inserts and then updates a record

            TestDataComplexData - Inserts multiple records of different Sobjects and links the records together

            TestDataBulkData - Inserts multiple records

            UtilDML - A utility class used to assign values to any fields of any Sobject

            KeyValue - This contains information about fields: their api name, value and data type

            KeyValueBulk - This contains KeyValue information, how many records to create for each object and whether or not to bypass code in triggers for rapid transactional processing (the default is not to bypass triggers)

            TriggerController - If you have any code in triggers also create this class

In my next blog I will start showing you the coding

Friday, 22 May 2015

What's good about Salesforce Summer 15 Release

Duplicate Management
I'm sure you are now aware of this; if not, you need to be, as it will be invaluable to every organisation.
Maintaining clean and accurate data is one of the most important things you can do to help your organization get the most out of
Salesforce. With Duplicate Management, you can control whether and when you allow users to create duplicate records
inside Salesforce; customize the logic that's used to identify duplicates; and create reports on the duplicates you do allow users to save.
DML Options
There is also a new DML option for duplicate rules: set it to true to make sure that sharing rules for the current user are enforced when duplicate rules run, or set it to false to use
the sharing rules specified in the class for the request. If no sharing rules are specified, Apex code runs in system context and
sharing rules for the current user are not enforced.
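From the Apex side this maps onto Database.DMLOptions; a hedged sketch (check the release notes for the exact property names):

```apex
Database.DMLOptions dml = new Database.DMLOptions();
dml.DuplicateRuleHeader.AllowSave = true;        // save the record even if flagged as a duplicate
dml.DuplicateRuleHeader.RunAsCurrentUser = true; // enforce the current user's sharing rules
Account acc = new Account(Name = 'Possible Duplicate');
Database.SaveResult sr = Database.insert(acc, dml);
```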

Search Namespace
The new System.Search.find(String) method performs dynamic SOSL queries and returns a Search.SearchResults object.
getSObject() - Returns an sObject from a SearchResult object.
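A hedged sketch of the new method (the SOSL query is only illustrative):

```apex
Search.SearchResults results = Search.find('FIND \'Acme\' IN ALL FIELDS RETURNING Account(Id, Name)');
List<Search.SearchResult> accResults = results.get('Account');
for(Search.SearchResult sr : accResults){
    Account acc = (Account) sr.getSObject(); // typed record from the search result
}
```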

Generic Model For Daisy Chain Batches

In the Winter 15 release Salesforce released my idea of daisy chaining batch jobs, so when one batch job finishes a new batch can be started. Now, if you have say 3 different batch jobs, and at the end of each batch a new batch must be started, or you need a decision-making process to decide whether to start a new batch, you would need to create 3 separate batch classes. So I've created a design model where only 1 batch class needs to be created. This provides a generic and flexible batch model, as shown below:

global class batchclass implements Database.Batchable<sObject>, Database.Stateful, Database.AllowsCallouts{
    global String batchType;
    global String soql;
    global Boolean success;

    //each time you would normally want to create a new batch class, instead simply add a new else if statement in execute() and finish()
    global batchclass(){
        //default batch type
        batchType = Constants.CONST_TYPE1;
    }

    global batchclass(String thisbatchType){
        //pass the batch type to run
        batchType = thisbatchType;
    }

    global batchclass(String thisbatchType, String thissoql){
        //pass the batch type to run and a soql string if the batch requires one
        batchType = thisbatchType;
        soql = thissoql;
    }

    global Database.QueryLocator start(Database.BatchableContext bc) {
        //Some batches will need to retrieve data from the database, use soql for this
        if (soql == null || soql == '')
            return Database.getQueryLocator('Select id From User limit 1');//data will not be used in execute() but User is used because there will always be at least 1 user in the org
        return Database.getQueryLocator(soql);
    }

    global void execute(Database.BatchableContext BC, List<sObject> glbs){
        //decide which function to call depending on the type, leaving headroom of 10 callouts
        if (Limits.getCallouts() < (Limits.getLimitCallouts() - 10)) {
            if (batchType == Constants.CONST_TYPE1)//Constants just contains static variable strings
                success = Utils.callFunction1();
            else if (batchType == Constants.CONST_TYPE2)
                success = Utils.callFunction2();
        }
        //the success variable decides whether the batch will be run again
    }

    global void finish(Database.BatchableContext BC){
        //if success = true, decideToRunFunction1Again is called, which will either call a different batch job or the same one. It may contain a decision process as well,
        //such as: each time execute() runs a custom setting is incremented and the decide functions look at this custom setting to decide whether to run the batch again
        if (batchType == Constants.CONST_TYPE1 && success){
            Utils.decideToRunFunction1Again();
        }else if (batchType == Constants.CONST_TYPE2 && success){
            Utils.decideToRunFunction2Again();
        }
    }
}
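Kicking off the generic batch is then a one-liner; a hedged sketch, assuming the Constants types above (the query is only illustrative):

```apex
// run the second batch type with a custom query
Database.executeBatch(new batchclass(Constants.CONST_TYPE2, 'Select Id From Account'));
```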


More Salesforce Spring 15 Release

Set Up Test Data for an Entire Test Class 
Use test setup methods (methods that are annotated with @testSetup) to create test records once and then access them in every test method in the test class. Test setup methods can be time-saving when you need to create reference or prerequisite data for all test methods, or a common set of records that all test methods operate on.

@testSetup static void setup() {
    // Create common test accounts
    List<Account> testAccts = new List<Account>();

    for(Integer i=0;i<2;i++) {
        testAccts.add(new Account(Name = 'TestAcct'+i));
    }

    insert testAccts;
}

//The Accounts created in setup() are automatically available in testMethod1() and testMethod2(), which makes our test methods easier to write and the tests will run faster and more efficiently

static testMethod void testMethod1() {
    Account acct = [SELECT Id FROM Account WHERE Name = 'TestAcct0' LIMIT 1];
    // test logic using acct
}

static testMethod void testMethod2() {
    List<Account> accts = [SELECT Id FROM Account];
    // test logic using accts
}

Chain More Jobs with Queueable Apex 
Queueable Apex was introduced in Winter ’15 and enables you to easily start and manage asynchronous processes. Previously, you could chain a queueable job to another job only once. You can now chain a job to another job an unlimited number of times. For Developer Edition and Trial organizations, your chain can have up to five queueable jobs.

public class AsyncExecutionExample implements Queueable {

    public void execute(QueueableContext context) {
        // Your processing logic here

        // Chain this job to the next job by submitting the next job
        System.enqueueJob(new SecondJob());  //2nd job queued
    }
}
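For completeness, SecondJob is just another Queueable class; a minimal sketch (the class name is only illustrative):

```apex
public class SecondJob implements Queueable {
    public void execute(QueueableContext context) {
        // the second job's processing logic here
    }
}
```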

More Salesforce Spring 15 Release

Import Accounts and Contacts with Ease
Your Name > My Settings > Import > Import My Accounts & Contacts,
and import from loads of sources: Outlook, LinkedIn, Excel, Yahoo, Zoho etc.

Educate Users with Salesforce Adoption Manager
Send tips by email to users to encourage good working practices.

Call an Apex Method from a Process
When no other process action can get the job done, add customized functionality to your Salesforce processes by calling an Apex method.
To call an Apex method, add the Call Apex action to your process and select an Apex class with an invocable method.

Monitor Your Users’ Login and Logout Activity
This could be used to monitor user activity, check whether a user is getting the most out of Salesforce, and possibly identify users to remove to save on licences.

Streamline Managed Packages by Deleting Unused Components

Develop Deployable Custom Metadata Types (Pilot)
You can now create custom metadata types and can then create custom metadata that uses these
types’ characteristics. ISVs and enterprise IT departments have in the past emulated custom metadata
types by using list custom settings and in some cases custom objects, but the rows on these items
are data, not metadata, and data can’t be deployed to different organizations.

Also, other features....

Mass Submit for Approval

Creation of Any Object via Workflow Rule - using the new Process Builder

Submit for Approval Through Workflow Rules - using the new Process Builder

Apex Called by Workflow

Allow Changes in Formula Field Value to Trigger Workflow

Implement an "Apex Queue" for Async Processing - Create your own queues for whatever you like

Submit More Batch Jobs with Apex Flex Queue (Generally Available) - Now allows potentially unlimited sequential batches to run

What's good about Salesforce Spring 15 Release

Open CTI
I'd expect the likes of New Voice Media are not going to be happy about this, but now developers will be able to easily integrate with whatever CTI system they like.

Outages and business continuity
This uses the Salesforce to Salesforce feature.
Some of the issues I see with this: Salesforce to Salesforce only works from a Production / Dev org to another Production / Dev org, so you won't be able to copy data to, say, your full sandbox. You will need a separate Production / Dev org with the same data storage size as your Production org, which means a lot of extra cost.
You will also need to make sure that all your metadata is up to date across both orgs and constantly update the Salesforce to Salesforce subscribe settings; otherwise your data will not transfer across correctly. All of this introduces extra work for your company to maintain.
But on a positive note, if you can easily select specific records / objects and export subsets of data, then this could be beneficial.
For good business continuity, do use the current Data Export ability, which can schedule objects to be exported to a csv file.

Important Fix
A useful feature was released in Winter 15 that allows you to deploy without deleting scheduled jobs by ticking a box in Deployment Settings.
For some orgs, when you go to Deployment Settings you will see Insufficient Privileges.
This is a known error but there is a workaround here

Quick Deploy
Also, I can confirm that Quick Deploy now fully works and is a great addition for deployments.