
Saturday, 2 June 2012

How to extract as much performance improvement as possible from unit tests

I urge everyone to adopt the following unit testing methodology: it will save a huge amount of time in your deployments while still ensuring quality code.

Some people, like Paul Battisson, have done a lot of work on using mock objects to avoid performing DML to create test data for unit tests. Mocking isn't exclusively for that purpose, but it is the aspect I want to concentrate on.
However, Paul's work doesn't currently help where you want to test your triggers. Testing triggers properly should involve bulk testing with 200-record DML operations. The problem is that every time you deploy, every unit test of every trigger will run those 200-record DMLs, which is very expensive and will slow down your deployments.
So what we need is a combination: use mock objects to test the bulk of your code, but for triggers run a full 200-record DML test on first deploy, then on subsequent deploys test just a handful of records, say 10, which is less expensive and so speeds up your deployments. Crucially, keep the built-in ability to increase the number of DMLs whenever you need to.

It is likely that your trigger tests cumulatively account for 90%+ of the DML that goes on across all your test classes. Address this and you address the issue of deployment times being so long.

First of all, look at Paul's work on mock objects:
https://github.com/pbattisson/Advanced-Force.com-Testing 


Testing Triggers by Luke Emberton and Steven Fouracre 

When deploying a trigger for the first time, set a custom setting to 200
so that up to 200 records are created in your unit test. After
deployment, lower this to 10 so that future deployments are not slowed
down but the triggers are still tested in bulk. If you are currently in an
empty sandbox, the unit test needs to be written to fall back to a default
value of, say, 10 when the custom setting is not populated.

The first step is to set up our triggers correctly. Have a look at this article about the trigger pattern:
 http://www.embracingthecloud.com/2010/07/08/ASimpleTriggerTemplateForSalesforce.aspx
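
A minimal sketch of that kind of template (the names are illustrative, not taken from the article) delegates all logic from the trigger to a handler class:

trigger AccountTrigger on Account (before insert, before update) {
    AccountTriggerHandler handler = new AccountTriggerHandler();
    if (Trigger.isBefore && Trigger.isInsert) {
        handler.onBeforeInsert((List<Account>) Trigger.new);
    }
    if (Trigger.isBefore && Trigger.isUpdate) {
        handler.onBeforeUpdate((List<Account>) Trigger.new, (Map<Id, Account>) Trigger.oldMap);
    }
}

public class AccountTriggerHandler {
    public void onBeforeInsert(List<Account> newRecords) {
        // bulk-safe logic here: no SOQL or DML inside loops
    }
    public void onBeforeUpdate(List<Account> newRecords, Map<Id, Account> oldMap) {
        // bulk-safe logic here
    }
}

The trigger itself stays trivial and all the testable logic lives in the class.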

By testing the trigger I will indirectly test the class, so you could
just test the trigger alone, but that is not unit testing: if the class is
later called from other classes and the trigger (along with its unit test) is
removed, there is potentially nothing covering the class. I also like to make
it obvious where the unit test that covers the trigger and class lives.

So when testing the trigger you need to test bulk DMLs; for the class, just test with 1 record. To set up the test data I use the same function for both the trigger and class tests, or for your classes you could use Paul's mock objects.

The following example tests update and delete triggers, but it can be slightly modified for inserts as well by creating the full number of new records specified in the custom setting. Account is used below purely as an illustrative object.


public static void setupData(Integer createData){

    // Find out how many records already exist in this org
    // (Account is an illustrative object; substitute your own)
    Account[] cLines = [SELECT Id FROM Account LIMIT :createData];

    if (createData > cLines.size()){
        // There aren't enough records in this org, so create the extras required
        Account[] newRecords = new Account[]{};
        for (Integer i = 1; i <= (createData - cLines.size()); i++){
            newRecords.add(new Account(Name = 'Test account ' + i));
        }
        insert newRecords;
    }
    // else: there are already enough records in this org, nothing to create
}



createData tells the function how many records to set up. The code inside the if statement creates only however many extra records are required to test the trigger or class.

Of course this means you need to use @isTest(SeeAllData=true).

The actual testmethod will likely be very similar for the trigger test and the class test; only the number you pass to setupData() differs.


When testing the class, createData is set to 1; when testing the trigger it is taken from a custom setting (CS) in the org. So when you are deploying your project, the CS is set to 200 to fully test your trigger, then it is lowered to, say, 10, so the trigger is still tested for bulk operations but future deployments are not slowed down by creating 200 records. Of course the CS can be changed whenever you like for future upgrades to your trigger.

Create a CS called Test_Triggers__c, with a number field Records__c.
In your trigger test:

Test_Triggers__c testTrigs = Test_Triggers__c.getInstance(<<object name>>);
Integer recordCreate = (testTrigs != null && testTrigs.Records__c != null)
    ? testTrigs.Records__c.intValue() : 10;
Now pass recordCreate to setupData()

If your CS is empty it will just default to 10; otherwise the number of records to create is passed to setupData(), which queries how many records already exist in the org and works out how many more need to be created. So after deployment, lower your CS from 200 to 10 to continue testing your triggers for bulk operations without the full cost.
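
Putting this together, a trigger testmethod might look something like the following sketch. Account, the method name and the 'Account' instance name passed to getInstance() are illustrative assumptions, not prescriptive:

@isTest(SeeAllData=true)
static void testTriggerBulk(){
    // 'Account' here stands in for <<object name>> above
    Test_Triggers__c testTrigs = Test_Triggers__c.getInstance('Account');
    Integer recordCreate = (testTrigs != null && testTrigs.Records__c != null)
        ? testTrigs.Records__c.intValue() : 10;
    setupData(recordCreate);

    Account[] accs = [SELECT Id, Name FROM Account LIMIT :recordCreate];
    Test.startTest();
    update accs; // fires the trigger in bulk, up to the CS value
    Test.stopTest();
    // assert whatever your trigger should have done to these records
}

The class test is identical except that you pass 1 to setupData().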

Efficient, easily manageable and quality testing by Luke Emberton and Steven Fouracre
Enjoy!

Monday, 7 May 2012

SeeAllData annotation in Spring 12 Salesforce Release

The SeeAllData annotation opens up a number of interesting things to think about.

Of course, when unit testing, a good developer will create his/her own test data to ensure that the unit tests work equally well in environments (live and sandbox) where there is not sufficient existing data to test against.

So with API version 24, SeeAllData is set to false as standard, and test methods don't have access by default to pre-existing data in the organization.
So that's good, right?
Well, yes and no. It is also important to test under conditions where there is a lot of data, to ensure that your tests work equally well there. Have you ever come across situations where your unit tests work in your developer sandbox, then you deploy to the full UAT sandbox and find that they no longer pass?
This can happen for many reasons: SOQL queries designed with no limits, limits on SOQL queries that are too restrictive and don't pick up the intended data, triggers that work on a small number of records but break on bulk uploads of, say, 200 records, and so on.
So if you only test with SeeAllData set to false, you won't fully test your classes.

So I suggest you create 2 sets of unit test methods: one where SeeAllData is false, one where it is true. The test methods will be exactly the same, but the creation of test data is different.


The isTest(SeeAllData=true) annotation is used to open up data access when applied at the class or method
level. However, using isTest(SeeAllData=false) on a method doesn’t restrict organization data access for that
method if the containing class has already been defined with the isTest(SeeAllData=true) annotation. In this
case, the method will still have access to all the data in the organization.



Say you are testing an insert trigger on Account.

Set your class to SeeAllData=false, which is the default.


The testmethod where SeeAllData is false is quite straightforward.
Test your classes in a data-empty environment, creating the data you need yourself:
Account thisAccount = new Account(Name='Steves test account');
insert thisAccount;

The testmethod where SeeAllData is true tests your class under full data conditions, to ensure it still works.
First identify how many Accounts currently exist in the system; there's no point in making the system create more Accounts than it needs to:


Account[] thisAccountTest = [select Id from Account limit 200];
Account[] newAccs = new Account[]{};
if (thisAccountTest.size() < 200){
      // note: i < (not <=) so we top up to exactly 200 records
      for (integer i = 0; i < 200 - thisAccountTest.size(); i++){
             newAccs.add(new Account(Name='Steves test account' + i));
      }
      insert newAccs;
}
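
A single test class can then hold both sets of methods, something like this skeleton (names are illustrative):

@isTest
private class AccountInsertTriggerTest {

    // Class default is SeeAllData=false: runs against an empty data set
    static testMethod void testInsertNoOrgData(){
        Account thisAccount = new Account(Name='Steves test account');
        insert thisAccount;
        // assertions here
    }

    // Opens up org data for this method only
    @isTest(SeeAllData=true)
    static void testInsertWithOrgData(){
        // top up to 200 Accounts as shown above, insert in bulk,
        // then assert the trigger behaved
    }
}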

You may ask why not just create tests where SeeAllData is true and create all the Accounts you need.
Well, such tests are slower to run, so every deployment of new code to the live environment would be slowed down. The answer is to force your SeeAllData=true test methods to run only during deployment to the live environment, while other deployments run only the SeeAllData=false test methods.
At the moment it's quite difficult to make SeeAllData=true tests run only on deployment, so the best alternative approach for now is to first validate all your code against the live environment with the SeeAllData=true test class included. As long as the validation passes, deploy your code without the SeeAllData=true test methods.
I suspect that Salesforce will at some point allow you to run certain test classes only at deployment, but for now this is not possible.

Note:

Test code saved against Salesforce API version 23.0 or earlier continues to have access to all data in the organization; its data access is unchanged. So if your environment contains a lot of data, those tests will run much more slowly.

Thursday, 3 May 2012

Autonumbers Increment In Unit Tests

Autonumber fields increment when running testmethods, and those increments are not rolled back. So when you go back to the real world, the next number produced by an autonumber field has jumped past all the values consumed by your tests, leaving gaps.

Surely not, you say.

Sorry, but it's true. So how do you change this? You have to turn off a setting in Salesforce: raise a Salesforce Case asking for the flag to be set to default=off on the specified custom objects.

For more information review 
http://success.salesforce.com/ideaView?id=08730000000Br67AAC

Thursday, 9 February 2012

Don't be fooled by the Currency data type

The Currency data type may display values to 2 decimal places, but the actual value stored in the Salesforce database can have more.
If data is uploaded via Data Loader or a similar tool, values can arrive with extra decimal places.
You will see the value in Salesforce displayed with 2 decimal places, but if you perform a SOQL query the value can have many more.
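
A quick way to see and control this in Apex (the object and field names here are just illustrations):

// The UI may show 10.46 while the stored value is 10.4567
Decimal raw = [SELECT Amount__c FROM Invoice__c LIMIT 1].Amount__c;
System.debug(raw);             // full stored precision
System.debug(raw.setScale(2)); // rounded to 2 decimal places

So if you compare currency values in code, round them explicitly rather than trusting what the page layout displays.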

Sunday, 22 January 2012

Why Is Continuous Integration Important

Continuous Integration is a software development practice where members of a team integrate their work frequently; usually each person integrates at least daily. Each integration is verified by an automated build to detect integration errors as quickly as possible.
Most teams find that this approach leads to significantly reduced integration problems and allows a team to develop cohesive software more rapidly. Teams that do this also find that fewer errors reach their live environment, so users grow more confident in their IT department; trust is a very valuable commodity, not easily bought but easily sold and lost.


Developers quickly get to understand where their code may break someone else's, so development does become quicker, but only after the pain of getting your codebase into a coherent state has been endured.


Continuous Integration cannot be practised in isolation; it must be accompanied by requirements gathering and a proper development life cycle to ensure the code produced is of quality, tested and documented.
If requirements gathering isn't done correctly, it's like building a house with no foundations: you are building code on something that is not properly understood. Once the code is written, the testers should test specifically against those requirements, plus regression testing.
If code isn't quality checked, the most efficient way of coding won't be found and understanding of your codebase won't be shared amongst your team. If code isn't tested well, it obviously won't match the requirements gathered. And who should test? Do you get your butcher to give you legal advice? No; you should have qualified testers who understand quality testing. Testing can be a fraught relationship between developer and tester; good testers understand this and manage the relationship. If code isn't documented, future developers won't understand why previous development was done, so a developer changes the code to what should be best practice only to realise he has affected another part of the system. This wastes time and causes the users to have less faith in the IT department, purely because documentation wasn't done correctly.


This is the minimum, there is more, such as source control.


Yes, all of this slows down development, which affects the time to roll out new projects, but it means the code being developed is of quality and is managed correctly. If the above isn't done, developers won't understand and know what is in their live system, so how do you know what is important and what isn't? Changing anything can and will have knock-on effects on other, undocumented parts of the system.


The lesson is: if you aren't doing the above now, don't wait until things go wrong. Don't be a reactive development department; become proactive and solve the problems before they arise.

Get Prepared For AppExchange

http://wiki.developerforce.com/page/Security_Review
This page covers things like the costs of submitting to the AppExchange, along with important links to help you test your code and get it ready for the AppExchange.

Free testing tools to check your code is written well:

http://security.force.com/webappscanner
http://security.force.com/sourcescanner

Of course you can use these tools not just in preparation for the AppExchange but for all your projects.

I'm currently working on an app that documents the class references made throughout your codebase, so that a class structure diagram can be produced from the output, and that automatically produces a deployment procedure from the classes, objects and pages you select to deploy.
If you would like to have this please contact me.

Friday, 18 November 2011

Serialize Batch Apex

In Force.com you cannot call a batch Apex class from another batch Apex class, because batch Apex is effectively a future call. You might think you could use Database.Stateful and the finish() method to mimic serialization of batch processes: after batch1 completes, its finish() method calls startNewBatch() in GeneralUtils, which fires the newBatch batch class.

However, running this you will see an error:
Database.executeBatch cannot be called from a batch or future method.


global class batch1 implements Database.Batchable<sObject>, Database.Stateful{

    // Query string used by the start method; define it for your own object
    global String query = 'SELECT Id FROM Account';

    global Database.QueryLocator start(Database.BatchableContext BC){
        return Database.getQueryLocator(query);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope){
        // process each chunk of records here
    }

    global void finish(Database.BatchableContext BC){
        GeneralUtils.startNewBatch(); // this is where the error is thrown
    }
}

public class GeneralUtils{

    public static void startNewBatch(){
        newBatch batchable = new newBatch();
        Id newBatchId = Database.executeBatch(batchable);
    }
}
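
One workaround I have seen used (a sketch on my part, not part of the attempt above) is to launch the next batch from Scheduled Apex, since Database.executeBatch is allowed from a scheduled context:

global class NewBatchScheduler implements Schedulable {
    global void execute(SchedulableContext sc){
        Database.executeBatch(new newBatch());
    }
}

Then in batch1.finish(), instead of calling GeneralUtils.startNewBatch() directly, schedule the job a minute or so ahead:

Datetime runAt = Datetime.now().addMinutes(1);
String cron = runAt.second() + ' ' + runAt.minute() + ' ' + runAt.hour()
    + ' ' + runAt.day() + ' ' + runAt.month() + ' ? ' + runAt.year();
System.schedule('Start newBatch', cron, new NewBatchScheduler());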