Alexander Brett

Bundling App::SFDC for fun and profit

12 August 2015


Whilst installing perl and App::SFDC along with a (quite large) number of dependencies is fun, effective and powerful, it's not always the best solution for Salesforce deployment tools. When you're deploying from throwaway AWS instances, or sending your tools to developers in foreign countries who want something that just works now, you may want to provide a ready-to-go bundle of code. Fortunately, programs such as PerlApp provide a pretty good way to achieve this.

I'm going to run through how to bundle App::SFDC into a standalone .exe suitable for deploying and retrieving metadata on a Windows machine.

Introduction to PerlApp

The idea behind PerlApp is pretty straightforward: you point it at a script, and it calculates the module dependencies and bundles the perl interpreter along with all required modules into a .exe, which can then be run without a local perl installation - essentially, you run perlapp --exe SFDC.exe C:\perl64\site\bin\SFDC.

Loading prerequisites

Of course, when you do this and run the resulting executable, some modules turn out to be missing - it's hard to detect all of the prerequisites, especially when they're being dynamically loaded in. For example, WWW::SFDC loads its API modules by running:

for my $module (qw'
    Apex Constants Metadata Partner Tooling
') {
    has $module,
      is => 'ro',
      lazy => 1,
      default => sub {
        my $self = shift;
        require "WWW/SFDC/$module.pm"; ## no critic
        "WWW::SFDC::$module"->new(session => $self);
      };
}

In a similar way, when you create a screen appender for Log::Log4perl, it quietly loads in Log::Log4perl::Appender::Screen. To fix this sort of issue, we add a few more arguments to perlapp:

perlapp --add MooX::Options::Role^
 --add App::SFDC::Role::^
 --add Log::Log4perl::Appender::Screen^
 --add WWW::SFDC::^
 --exe SFDC.exe C:\perl64\site\bin\SFDC

Fixing SSL certification

Perl isn't great at picking up a system's SSL settings, especially installed certificates - and when the entire purpose of a script is to send HTTPS requests, that's something you have to get right, lest you get errors like 500 C:\Users\ALEXAN~1\AppData\Local\Temp\pdk-alexanderbrett/Mozilla/CA/cacert.pem on disk corrupt at /<C:\Dev\App-SFDC\SFDC.exe>WWW/ line 66.

One successful workaround I've found for this sort of error, which works whenever curl is installed, is to use curl's Certificate Authority file instead of perl's. You can find it by running curl -v >nul and looking for lines like:

* successfully set certificate verify locations:
*   CAfile: C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt

Then you set HTTPS_CA_FILE=C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt and your HTTPS connections start working again. That amount of manual faffing around is more than most developers or AWS images want to do, but fortunately PerlApp has our back again - we can bind in arbitrary files and specify arbitrary environment variables. Let's add more arguments to perlapp:

--bind certs/cafile.crt[file="C:\Program Files (x86)\Git\bin\curl-ca-bundle.crt",text,mode=666]^
--env HTTPS_CA_FILE=certs/cafile.crt^

Binding Retrieve plugins

Since we're going to want to use App::SFDC::Command::Retrieve, we need to make sure the plugins and manifests it uses are, in fact, included. By default they are installed to the perl share/ location, and PerlApp won't see them! This is how to bind in the default values:

--bind manifests/base.xml[file=C:\perl64\site\lib\auto\Share\dist\App-SFDC-Metadata\manifests\base.xml,text,mode=666]^
--bind manifests/all.xml[file=C:\perl64\site\lib\auto\Share\dist\App-SFDC-Metadata\manifests\all.xml,text,mode=666]^
--bind plugins/[file=C:\perl64\site\lib\auto\Share\dist\App-SFDC-Metadata\plugins\,text,mode=777]^

We should also ensure any dependencies of those plugins are loaded:

--scan C:\perl64\site\lib\auto\Share\dist\App-SFDC-Metadata\plugins\

This is the point at which you may want to override these values! If you have specific requirements for your manifests, for the folders you want to retrieve, or anything like that, create your own versions of those files and bundle those in instead.
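For instance, to ship a team-specific manifest in place of the default one, the same `--bind` syntax applies - the local paths here are purely illustrative:

```
perlapp ^
 --bind manifests/base.xml[file=C:\Dev\MyManifests\team-base.xml,text,mode=666]^
 --bind plugins/[file=C:\Dev\MyPlugins\,text,mode=777]^
 --exe SFDC.exe C:\perl64\site\bin\SFDC
```

The bound-in path is what the bundled code sees, so nothing inside App::SFDC needs to change.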

Why doesn’t it work yet?

This was all great up to v0.13, but this approach stopped working from v0.14 onwards. At that point I moved from a monolithic everything-in-one-package approach to a dynamically-loading, plugin-oriented architecture, which allows anybody to create a command by naming their package App::SFDC::Command::Foo. The code that makes that happen is:

use File::Find;

my @commands;
find(
    {
        wanted => sub {push @commands, $1 if m'App/SFDC/Command/(\w*)\.pm'},
        no_chdir => 1
    },
    grep {-e} map {$_.'/App/SFDC'} @INC
);

…and this approach is completely broken by PerlApp - when running this, you get the error invalid top directory at /<C:\Dev\App-SFDC\SFDC.exe>File/ line 472., because PerlApp doesn’t create any recognisable directory structure for bundled modules - it provides an overloaded version of require which gets the required module from somewhere non-obvious.

After trying a few different things, it seems that the simplest way to achieve a nicely-bundled .exe is to write a new script which avoids the pitfalls of detecting commands at runtime. We can, in fact, write a small perl program which writes that script for us - the output is the same idea as the dynamic version, but static:

use strict;
use warnings;
use 5.12.0;
use App::SFDC;

say <<'HEAD';
package SFDC;
use strict;
use warnings;
HEAD

say "use App::SFDC::Command::$_;" for @App::SFDC::commands;

say 'my @commands = ("'
    . (join '","', @App::SFDC::commands)
    . '");';

say <<'BODY';
my $usage = join "\n\n",
    "SFDC: Tools for interacting with Salesforce",
    "Available commands:",
    (join "\n", map {"\t$_"} @commands),
    "For more detail, run: SFDC <command> --help";

my $command = shift;
exit 1 unless do {
    if ($command) {
        if (my ($correct_command) = grep {/^$command$/i} @commands) {
            # (dispatch assumed: MooX::Options constructor, then the command's entry point)
            "App::SFDC::Command::$correct_command"->new_with_options->execute();
        } else {
            print $usage;
            0;
        }
    } else {
        print $usage;
        0;
    }
};
BODY

Tying it all together

Using perl -x in a batch file, we can combine the perl script-writing script and the call to PerlApp into one easy-to-digest package, by using some syntax like:

perl -x %0 > SFDC.pl

perlapp ^
 --info CompanyName=Sophos;LegalCopyright="This software is Copyright (c) 2015 by Sophos Limited This is free software, licensed under the MIT (X11) License"^
 --norunlib --force --exe SFDC.exe SFDC.pl

goto :endofperl

#!perl
use strict;
# ... (the rest of the script-writing script from the previous section)



For a full version, I’ve created a gist to play with.

Tags: SFDC Perl

Logging::Trivial - or, why to hold off on that logging module you wrote

03 May 2015

A while back, I wrote WWW::SFDC as well as a few programs calling it, and I wanted the world's most trivial logging module which still allowed for 5-level (DETAIL, DEBUG, INFO, WARN, ERROR) logging. I couldn't find anything appropriate on CPAN, so I rolled my own and called it Logging::Trivial.

Now, I was pretty happy with this, and got it all ready to be a grown-up cpan module (my first), so I went on prepan and said, guys, what do you think. I wouldn’t call it a slap-down, but I got some pretty robust advice not to publish Yet Another Logging Module, and as a result I decided that I’d sit on it - I wouldn’t refactor it out until I found a suitable replacement, but I wouldn’t publish.

Today I revisited the issue and found Log::Log4perl's easy mode, and I'm very pleased because it does exactly what I want, with very, very little rewriting of code. I ran perl -i.bak -pe 's/Logging::Trivial/Log::Log4perl ":easy"/; s/DETAIL/TRACE/; s/ERROR/LOGDIE/;' and was essentially done.
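As a toy illustration of what that one-liner does, here are the same three substitutions applied to a single line of code rather than in-place to files (the sample line is invented):

```perl
use strict;
use warnings;

# one line of code as it looked under Logging::Trivial
my $line = 'use Logging::Trivial; DETAIL "starting"; ERROR "it broke";';

# the three substitutions from the one-liner above
for ($line) {
    s/Logging::Trivial/Log::Log4perl ":easy"/;
    s/DETAIL/TRACE/;
    s/ERROR/LOGDIE/;
}

print "$line\n";  # use Log::Log4perl ":easy"; TRACE "starting"; LOGDIE "it broke";
```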

I think the moral of the story is that when you're not quite sure that your solution is going to stand the test of time, wait a while to see whether it does. In my case, it didn't, but I'm better off for that, and so is CPAN.

Tags: Perl

Creating attachments with WWW::SFDC

27 April 2015

At my company, we have a problem with attachments; namely, one of the things we have to do in most releases is update some mail merge templates which are stored as attachments against Salesforce records. Historically, this has been a great source of error; at some point the release manager ends up with a sea of .docx files in some directory or other, having to remember which ones she's already uploaded, which ones need to be uploaded against which records, and so on.

Checking these files into source control helped; now, instead of a soup of randomly named files kicking around, at least she knows she has the latest version, and has an easy way to ascertain which ones have changed. It's still a tedious and error-prone manual process though, and one slip-up means that a distributor receives a million dollars worth of junk on their license schedule.

Fortunately, the correct automation solution is at hand: WWW::SFDC to the rescue. Let's imagine we've been sensible and we've stored a load of .docx and .xlsx files inside source control already, and with even more foresight, we have named each file the same as the Name on the Doc_Template__c record against which we need to store them. What we need to do is create Attachment records for each changed document against each relevant record.

Let’s automate it in 10 easy steps.

  1. Get the changed files:

     git diff --name-only --diff-filter=AM --relative attachments/Doc_Template__c/
     # Or, if you'd rather just have everything (for instance, populating a new sandbox)
     git ls-files attachments/Doc_Template__c/
  2. Basic perl script setup, including making sure creds are set up:

     use strict;
     use warnings;
     use Getopt::Long;
     use WWW::SFDC::Partner;
     use File::Slurp 'read_file';

     my %creds = (
       url => ''
     );
     GetOptions
       'p|password:s'    => \$creds{password},
       'url:s'           => \$creds{url},
       'u|username:s'    => \$creds{username};
     WWW::SFDC::Partner->instance(creds => \%creds); # (argument shape assumed)
  3. Read in the changed files, checking that they exist and trimming whitespace:

     my @filenames = grep {chomp and -e } <>;
     exit unless scalar @filenames; # if there aren't any, we have no work to do.
  4. Parse the filenames and read the files. We'll store the file contents as base64-encoded data, which is what the partner API expects us to provide in the body field. Fortunately, SOAP::Lite makes this a breeze, via the SOAP::Data->type() method.

     my @documents = map {
         m{([^/]+)\.(\w+)$}           # (pattern assumed: capture name and extension)
         ? +{                         # +{} forces this to be a hashref, rather than an arbitrary code block
             name => $1,
             extension => $2,
             body => SOAP::Data->name(
                     body => read_file $_ # read_file is exported by File::Slurp and does what it says on the tin
                 )->type('base64')        # prepackaged SOAP::Lite magic.
           }
         : ();                            # return empty list.
     } @filenames;
  5. We need the IDs of the Doc_Template__c records to store these against, so we'll start by constructing a where clause…

     my $whereClause = join " OR ", map { "Name='$$_{name}'" } @documents; # that was easy
  6. …and we’ll go and execute that query.

     my @parentRecords = WWW::SFDC::Partner->instance()->query(
       "SELECT Id, Name FROM Doc_Template__c WHERE $whereClause"
     );
  7. We’re going to need to look up IDs from the name, so we’ll create a hash:

     my %parentIds = map { $_->{Name} => $_->{Id} } @parentRecords;
  8. The interesting bit. From each document, create an Attachment suitable for passing into a create call.

     my @recordsToCreate = map {
         $parentIds{$_->{name}} # check that the record exists
         ? +{
             type => 'Attachment',
             ParentId => $parentIds{$_->{name}},
             name => "$$_{name}.$$_{extension}",
             body => $_->{body},
           }
         : ()
     } @documents;
  9. Wrap it all up with a create call:

     WWW::SFDC::Partner->instance()->create(@recordsToCreate);

  10. Put it all together (tada!):

    git diff --name-only --diff-filter=AM --relative attachments/Doc_Template__c/^
     | perl -u myUsername -p passwordPlusToken
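Steps 3-8 can be condensed into a self-contained sketch, with a hard-coded query result standing in for the live Partner API calls (all the names and Ids here are made up, and the SOAP body is omitted):

```perl
use strict;
use warnings;

# stand-in for the WWW::SFDC::Partner query in steps 5-7
my @parentRecords = (
    {Name => 'Invoice', Id => 'a0A000000000001'},
    {Name => 'Quote',   Id => 'a0A000000000002'},
);
my %parentIds = map { $_->{Name} => $_->{Id} } @parentRecords;

# step 4: parse names and extensions out of the file paths
my @documents = map {
    m{([^/\\]+)\.(docx|xlsx)$}
        ? +{name => $1, extension => $2}
        : ()
} qw(
    attachments/Doc_Template__c/Invoice.docx
    attachments/Doc_Template__c/Unknown.docx
);

# step 8: only build Attachment records for documents with a parent record
my @recordsToCreate = map {
    $parentIds{$_->{name}}
        ? +{
            type     => 'Attachment',
            ParentId => $parentIds{$_->{name}},
            name     => "$$_{name}.$$_{extension}",
          }
        : ()
} @documents;

printf "%d record(s) to create\n", scalar @recordsToCreate;  # 1 record(s) to create
```

Note how the `? +{...} : ()` idiom silently drops files with no matching record - in the real script you may prefer to warn about those instead.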

Now, this may feel trivial. However, having repeatable, automatic, guaranteed-error-free deployments of templates every month saves hours of effort on release day, and hours of tracking down bugs later on.

Tags: SFDC Perl

Running unit tests using WWW::SFDC

02 February 2015

This post is going to start basic and get gradually more complicated - I suggest that you stop once you've satisfied your own requirements! The basic premise is that, in order to keep a consistently high-quality and deployable set of metadata, we keep an org up to date with a branch, and nightly (or more often) we run every unit test against it. This makes sure that, even though developers are keeping abreast of the effects their changes have on other parts of the code base and so on, there is one extra highly visible and reportable check on quality.

Fortunately, the tooling API makes our job quite easy. If you have simple needs, the following 3 statements should be sufficient for running your tests:

my $parentJobId = WWW::SFDC::Tooling->instance()->runTestsAsynchronous(
  map {
    $_->{Id}
  } WWW::SFDC::Tooling->instance()->query(
    "SELECT Id FROM ApexClass WHERE NamespacePrefix = ''"
  )
);

sleep 60 while WWW::SFDC::Tooling->instance()->query(
  "SELECT Id, Status, ApexClassId FROM ApexTestQueueItem"
  . " WHERE ParentJobId = '$parentJobId' AND ("
  . " Status='Queued' OR Status='Processing'"
  . " OR Status='Preparing' OR Status='Holding')"
);

my @results = WWW::SFDC::Tooling->instance()->query(
  "SELECT Id, Outcome, MethodName, Message, StackTrace, ApexClass.Name "
  ."FROM ApexTestResult WHERE AsyncApexJobId = '$parentJobId'"
);

However, this is far from a perfect implementation. The first thing to notice is that we're just passing every Apex class into runTestsAsynchronous, which is a highly inefficient way to do things (according to my measurements, it adds about 5% to the total time for the tests). It would be more elegant and quicker if we filtered the results of the first query to find test classes, and fortunately we can do this using the SymbolTable field on ApexClass - a class needs to be enqueued if it, or any of its methods, has the TEST modifier. This can be achieved thus:

sub isTestModified {
  my $thing = shift;
  defined $thing->{modifiers} and (
    $thing->{modifiers} eq 'TEST'
    or (
      ref $thing->{modifiers} eq 'ARRAY'
      and grep {$_ eq 'TEST'} @{$thing->{modifiers}}
    )
  );
}

sub filterTests {
  defined $_->{SymbolTable}->{methods} and (
      ref $_->{SymbolTable}->{methods} eq 'ARRAY'
      and grep {isTestModified($_)} @{$_->{SymbolTable}->{methods}}
    ) or (
      ref $_->{SymbolTable}->{methods} eq 'HASH'
      and isTestModified($_->{SymbolTable}->{methods})
    );
}

my $parentJobId = WWW::SFDC::Tooling->instance()->runTestsAsynchronous(
  map { $_->{Id} } grep { filterTests } WWW::SFDC::Tooling->instance()->query(
    "SELECT Id, SymbolTable FROM ApexClass WHERE NamespacePrefix = ''"
  )
);

Now, you're pretty happy with how this is working, but sometimes this query seemingly-randomly times out. The reason this happens is that if you have deployed one or more Apex classes since the last compilation, requesting the SymbolTables triggers a behind-the-scenes recompilation of your entire code base, which will take longer than the 120s timeout once you get to a certain size. My solution to this was just a brute-force retry mechanism, which can also mitigate any brief networking issues on the client (it sucks if you're halfway through a several-hour test run and you get no results because there was a momentary VPN lapse…), and I achieved it by replacing all the calls to WWW::SFDC::Tooling->instance()->query() with retryQuery():

sub retryQuery {
  my $query = shift;
  my @result;
  for (0..5) {
    eval { @result = WWW::SFDC::Tooling->instance()->query($query); };
    next if $@;
    return (scalar @result == 1 and not defined $result[0] ? undef : @result);
  }
  die "There was an error retrieving the information from Salesforce\nQuery: $query\nError: $@";
}

Bear in mind that I’ve recently modified WWW::SFDC to automatically re-authenticate in the event of a session timeout, which is extremely useful for long test runs!

At this point you feel ready to put your code on a continuous integration server and let it rip. When I did this I ran into a weird problem where the perl process used up all of the available RAM and started thrashing swap space, taking 7 minutes even to start off the tests; it turns out that the SymbolTable for every class, all at the same time, is quite a mouthful to deserialise. The obvious solution is to query in batches, but the tooling API, as far as I can tell, does not support paged queries in the same way as the Partner API does. Cue more grep {} map {} grep {} chaining:

my $parentJobId = WWW::SFDC::Tooling->instance()->runTestsAsynchronous(
  map { $_->{Id} } grep { filterTests } map {
    retryQuery(
      "SELECT Id, SymbolTable FROM ApexClass WHERE NamespacePrefix = ''"
      . " LIMIT 200 OFFSET $_"
    )
  } grep {
    $_ % 200 == 0
  } 0..(scalar retryQuery(
    "SELECT Id FROM ApexClass WHERE NamespacePrefix = ''"
  ) - 1)
);

Now we have a robust and efficient way to run all of our unit tests on SFDC. The last thing I wanted to do was to have all of these test results aggregated and reported on with Atlassian Bamboo. Bamboo comes with a built-in JUnit parser, and JUnit has a fairly simple syntax, so for me the path of least resistance was to be a terrible person and roll my own XML formatter:

sub jUnitFormat {
  my $result = shift;
  my $className = $$result{ApexClass}{Name};
  my $methodName = $$result{MethodName} eq "<compile>"
     ? "CompileFailed"
     : $$result{MethodName};
  return "<testcase name='$methodName' classname='$className' "
    . "status='$$result{Outcome}'>"
    . (defined $$result{Message}
      ? "<failure><![CDATA[$$result{StackTrace}\n$$result{Message}]]></failure>"
      : "")
    . "</testcase>";
}

open my $fh, ">", $output;
print $fh join "\n",
  '<?xml version="1.0" encoding="UTF-8"?>',
  '<testsuite name="SFDC Unit Tests">',
  (map {jUnitFormat($_)} @results),
  '</testsuite>';
close $fh;

…easy peasy.

I’ve uploaded the final version as a runnable perl script as a gist - I very much encourage you to give it a try, and maybe even help me come up with more improvements.

Tags: SFDC Perl

Why Perl for Salesforce Tools?

23 January 2015

This explanation is going to take the form of a long story.

When I started writing developer tools for Salesforce, I got hold of the ant Salesforce deployment library and went wild with it, with the ant zip task, loads of exec calls to git, and some extreme contortions to allow me to switch manifest files depending on whether we wanted to retrieve everything or just everything a developer normally needs. I quickly hit some situations where ant just wasn't expressive enough for my needs:

  • Deploying to multiple environments and adding a -v for validation flag prompted me to write this post about mixing in some javascript with my ant.
  • Our QA org has some different outbound message endpoints from production. Since the GNU coreutils are installed alongside git, using perl for this was a no-brainer:
<exec executable="perl">
	<arg value="-i.bak"/>
	<arg value="-pe"/>
	<arg value="s/&lt;endpointUrl&gt;https:\/\/.*\.sophos\.com\//&lt;endpointUrl&gt;https:\/\/${endpoint}\//"/>
	<arg value="src/workflows/*.workflow"/>
</exec>

Once I’d got over these, I found that I needed to convert a list of files into a package.xml file. This turned out to be far too complicated for inline scripting or a perl one-liner, so I wrote a separate javascript file called list2xml.js and called it like:

<script src="lib/list2xml.js"
            outputProperty="${xmlBody}" />

However, the javascript I wrote for this turned out to have terrible performance. To process a file you need to:

  • strip off any src/ prefix
  • split up the file path
  • remove the file extension
  • get the metadata type (for a folder called classes/ the metadata type is ApexClass, for instance) which is effectively a hash lookup
  • store each component name against its metadata type
  • dedupe the resulting arrays
  • write out the xml string

This is quite a lot of string manipulation and array munging, and javascript is not great at this, especially running in Nashorn/JDK1.8. I had quite a few developers at my desk asking about how long all this was taking, so eventually I realised I needed to reimplement it in some faster language. Given the performance hit was in string processing, I came to the obvious conclusion: no language has faster string processing than Perl, and all my users already had it installed! The reimplemented list2xml eventually became part of WWW::SFDC.
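The steps above translate into Perl quite directly. This is a self-contained sketch of the idea rather than the real list2xml (the %types lookup is abbreviated, and the real tool also handles -meta.xml files and foldered metadata):

```perl
use strict;
use warnings;

# abbreviated folder-to-metadata-type lookup
my %types = (
    classes  => 'ApexClass',
    pages    => 'ApexPage',
    triggers => 'ApexTrigger',
    objects  => 'CustomObject',
);

sub list2xml {
    my @files = @_;    # copy, so the s/// below doesn't touch the caller's data
    my %members;
    for my $file (@files) {
        $file =~ s{^src/}{};                        # strip any src/ prefix
        my ($folder, $name) = split m{/}, $file, 2; # split up the file path
        next unless defined $name and exists $types{$folder};
        $name =~ s/\.[^.]+$//;                      # remove the file extension
        $members{$types{$folder}}{$name} = 1;       # store name against type, deduped
    }
    # write out the xml string
    return join "\n", map {
        "<types>"
        . join("", map {"<members>$_</members>"} sort keys %{$members{$_}})
        . "<name>$_</name></types>"
    } sort keys %members;
}

print list2xml(qw(src/classes/Foo.cls src/classes/Foo.cls src/pages/Bar.page)), "\n";
# <types><members>Foo</members><name>ApexClass</name></types>
# <types><members>Bar</members><name>ApexPage</name></types>
```

The hash-of-hashes does the deduplication for free, which is exactly the sort of bookkeeping that was painful in javascript.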

After coming across a few of each of these situations, my codebase was a mess of xml with intermingled javascript, perl system calls, and perl and javascript files. Not only was it inconsistent, it was also entirely untested and essentially untestable, since ant is not designed for heavy lifting and as such has no testing framework, and nor does javascript (well, it probably does, but not a well-known and easy-to-use one). My users were back at my desk demanding fewer bugs.

My boss and I sat down to examine the options: we wanted ease of development and maintenance, high performance and a robust unit testing framework as a minimum. We narrowed the selection down to C# and Perl because of the skills available, and then did a head-to-head on a string processing algorithm; not only did Perl come out conclusively faster, it also had the advantage of not requiring a compilation stage, being usable for our contractors on their macs, and already having critical pain areas implemented in it.

I also found that juggling several different APIs at once was a pleasure using Perl, as was unit testing; I’m feeling pretty happy about the interface I’ve written to the metadata API, and I’m going to demonstrate some really powerful uses for the other APIs over the next couple of weeks.

I was pretty apprehensive about Perl before I started to use it in earnest; most of the reactions I get to it are along the lines of "It's a write-only language", "It isn't maintainable", "It's out-of-date" and so on; but I've found that it's quite easy, and in fact quite fun, to write highly performant, highly legible, well-documented and unit-tested code.

Tags: Perl SFDC

WWW::SFDC - Metadata, Manifests and Zip Files

21 January 2015

In my last post I provided a brief introduction to WWW::SFDC, a Perl wrapper around Salesforce’s API interfaces. Today I thought I’d look slightly more in-depth at WWW::SFDC::Metadata, WWW::SFDC::Manifest, and WWW::SFDC::Zip, and how they interact with each other.

The motivation for including the Manifest and Zip modules in addition to the core API interaction modules is that even after a mechanism is provided for interfacing with the metadata API, the fact remains that deploy and retrieve accept and return .zip files as base64-encoded strings, deploy requires a package.xml file populated according to a stringent set of rules, and so on. Writing the logic around these has consumed a large proportion of my time as a tooling developer, and will be the same for every use case, so it makes sense to tie it in to the API interface.

The idea behind the Manifest object is to make it extremely easy to juggle lists of files, package.xml manifest files, and the inputs and outputs of the metadata API calls. For instance, the simplest way to generate empty manifest files is now:


If we want to generate a custom package based on a list of modified files can look something like this:

my @filesToAdd = `git diff --name-only release master`;

The last major problem that this package solves is the generation of a list of files, given a package, bearing in mind that certain metadata types require -meta.xml files, and others are in folders. To build a valid .zip file for deployment, you need to know exactly which files to include, and you can do this thus:


This object then plays extremely well with the metadata API functions. If you want to retrieve every file in your org, you’d normally need to write out a package.xml including every document and email template you cared about. With the listMetadata call, you can just list the folders you care about, and you can chain this call with the appropriate manifest methods, to get extremely powerful one-liners such as:

    WWW::SFDC::Manifest->new()->addList(
        WWW::SFDC::Metadata->instance()->listMetadata(
            {type => 'Document', folder => 'Developer_Documents'},
            {type => 'Document', folder => 'Documents'},
            {type => 'Document', folder => 'Invoices'},
            {type => 'Document', folder => 'Lead_Images'},
            {type => 'Document', folder => 'Logos'},
            {type => 'Document', folder => 'Tab_Images'},
            {type => 'EmailTemplate', folder => 'Asset'},
            {type => 'EmailTemplate', folder => 'Contact_User'},
            {type => 'EmailTemplate', folder => 'Error_Reporting'},
            {type => 'EmailTemplate', folder => 'Marketing_Templates'},
            {type => 'EmailTemplate', folder => 'Support_Templates'},
            {type => 'Report', folder => 'Merge_Reports'},
            {type => 'Report', folder => 'Finance_Reports'},
        )
    );
And once you’ve done this, you can deploy them all again like this:

    WWW::SFDC::Metadata->instance()->deployMetadata($zip, {
        singlePackage => 'true',
        $IS_VALIDATE ? (checkOnly => 'true') : (),
        rollbackOnError => 'true',
    });

Of course, if you dynamically regenerate your package.xml, you probably won’t check it into source control, and you don’t want to take it as the truth when working out what to deploy. I actually construct my zip file like this:

  my $zip = WWW::SFDC::Zip::makezip(
    'src',
    WWW::SFDC::Manifest->new()->addList(`git ls-files src/`)->getFileList(),
    'package.xml'
  );

One final element of note is that WWW::SFDC::Zip::unzip accepts an optional third parameter: a function reference, applied to each file retrieved before being written to disk. I use this to achieve profile compression (see my recent post on that topic) like this:

sub _compressProfile {
  my $content = shift;
  my @lines = split /^/, $content;
  for (@lines) {
    s/\r//g;                  # remove all CR characters
    s/\t/    /g;              # replace all tabs with 4 spaces
    if (/^\s/) {              # ignore the xml root node
      s/\n//;                 # remove newlines
      s/^    (?=<(?!\/))/\n/; # insert newlines where appropriate
      s/^(    )+//;           # trim remaining whitespace
    }
  }
  return join "", @lines;
}

sub _retrieveTimeMetadataChanges {
  my ($path, $content) = @_;
  $content = _compressProfile $content if $path =~ /\.profile|\.permissionset/;
  return $content;
}

I think that covers the main uses for those modules, and to those like me who have been grappling for months with the quirks of the ant deployment tool, the benefits of using a real programming language to achieve these tasks with minimum fuss are really obvious.

Tags: Perl SFDC

Introducing WWW::SFDC - a perl module for Salesforce interactions

19 January 2015

I've been working with the 'Force.com migration tool' - an ant wrapper around the Metadata API - for almost a year now, and I've been getting increasingly frustrated with attempting to coerce ant (a tool for compiling java code) to be a deployment, retrieval, and git integration controller. It just isn't meant for that purpose!

This means that when I have a list of EmailTemplate folders and want to retrieve all the templates within them, I have to spend 200 lines of xml to:

  • use some inline javascript to get the login credentials from a file depending which environment is specified
  • call a C# executable to ascertain whether the login credentials are valid, otherwise the user will get frozen out of Salesforce because of too many failed login attempts
  • call listMetadata once for each folder, which outputs the results to a file
  • call a perl script to parse those files
  • call another perl script to write the manifest file
  • call retrieve (finally!)

This is a terrible way to operate for any organisation moderately invested in getting a proper handle on SFDC metadata, and is also really difficult to test in any meaningful manner.

The issues I was having with ant inspired me to write a new perl module, WWW::SFDC, for interacting with Salesforce APIs. The aim is to provide Do-What-I-Mean wrappers around all of the SFDC APIs, such that they all work together and achieving moderately complicated interactions is as quick and easy as possible. The module is only about 40% complete, if you consider 'complete' to mean 'has a wrapper around every function in the Partner, Metadata and Tooling API and is unit tested', but it's got enough to achieve some seriously useful stuff. For instance, the above-mentioned retrieve task can now be accomplished like this:

WWW::SFDC::Metadata->instance(creds => {
    password  => $password,
    username  => $username,
    url       => $url
});

WWW::SFDC::Zip::unzip(
    'src',
    WWW::SFDC::Metadata->instance()->retrieveMetadata(
        WWW::SFDC::Manifest->new()->addList(
            WWW::SFDC::Metadata->instance()->listMetadata(
				{type => 'Document', folder => 'Apps'},
				{type => 'Document', folder => 'Developer_Documents'},
				{type => 'Document', folder => 'Documents'},
				{type => 'EmailTemplate', folder => 'Sales_Templates'},
				{type => 'Report', folder => 'Merge_Reports'},
            )
        )
    )
);
…which I think is pretty neat. There are some basic examples included, and I intend to make some blog posts demonstrating things that I already use this module for. I'm not planning to release it on CPAN until it's more complete, but it is installable from git.

Note that it also currently has a dependency on Logging::Trivial: I’m not sure what I’m going to do with this, since it’s difficult to differentiate Logging::Trivial amongst all the other logging modules, but it is my favourite! I may eventually rename Logging::Trivial, or change the logging engine to Logging::Minimal.

Tags: SFDC Perl