Social Memo: Facebook Organic Reach Isn’t Dead – 5 Tips To Increase Your Reach!


Before we begin, if you missed last month’s social memo, give it a read as it lays the foundation for this next installment. Here we go!

Facebook Organic Reach is not dead. Here are 5 tips you can use to increase your reach. 

1) Stop producing promotional content.

If you’re currently filling your Facebook Page’s content calendar with coupons, direct sales, or highly promotional content, then it’s time to stop.

2) Learn what your fans actually like.
If you are creating content just because you think you should, with no real strategy and no conception of what your fans will actually respond to, you’re doing it wrong. Page Managers need to periodically perform social audits to see what type of content is generating the most engagement.

Additionally, Page Managers have access to Facebook Audience Insights — a tool that gives great information on fans’ demographics, occupations, locations, and other relevant Facebook pages. Use these resources to your benefit and leverage the data to generate content that will better resonate with your fans.

3) Utilize Influencers
You need to start valuing the impact an Influencer can make on your social reach. Influencers are social media mavens who have built a strong following in both quantity and quality. An Influencer can do a lot for your efforts to expand your social presence, because they have already done the work of building their following and earning their fans’ trust. So when they share your content, those fans will at least give it a chance, since it came from a source they rely on.

4) Make sure your fans are following you
Have all of your email subscribers liked your page? What about your Twitter followers? What about repeat customers? You are most likely missing out on some organic reach because not all of your fans are Facebook fans. Send out an email to your most loyal customers asking them to like you on Facebook. See if you get an increase in likes. If you do, you are increasing your potential organic reach!

5) Buy Ads to Increase Fan Affinity! 
Whoa whoa whoa! We are talking about organic reach, right? Right. So why am I telling you to go and spend money? Well, you need to understand how Facebook works. One of the key elements of Facebook’s EdgeRank algorithm is the principle of Affinity. This is essentially how connected a fan is to your page and content. If a fan is constantly liking or sharing your content, they will have a higher affinity for your page and thus continue to see your content.

Let’s say you haven’t done the best job pushing quality content to your page, and/or you’ve been spending too much of your time publishing overly promotional advertisements. This has created a lower affinity for your page. Now you’ve learned the error of your ways and are ready to create some killer quality content (possibly based on the data you gained from learning about your audience; see Tip 2). That “bad” content has done some serious damage to your page’s affinity score, so even though you are putting better content on your page, your fans may not see it. Their inactivity toward your page’s previously poor content is holding your reach down.

This is where a smart ad buy can get you back in the game. By putting money behind your best content and targeting your audience, you can rebuild your affinity with your fans, thus increasing your organic reach.

Swept up in the stream: 5 ways ISL used Gulp to improve our process


ISL recently switched our front-end build system from Grunt to Gulp. Our primary motivations for switching are Gulp’s preference for code over configuration, node streams, and asynchronous tasks by default. After modifying our existing build process to adhere to the Gulp way, we saw a drastic improvement in overall build speed, on the order of several seconds, and near instant rebuilds for live reloading during development.


By adopting Gulp we’re actually scripting and coding our build process, unlike Grunt, where we merely configured tasks. For example, if you want to tap into a Gulp stream and transform a file using plain old vanilla JavaScript, you can. It lets developers customize their build workflow much more easily, and the streaming paradigm works well since we’re literally just dealing with text files.

Gulp streams mimic the paradigm of Unix pipelines and redirection that we love so much. Combining this with Gulp’s asynchronous task execution powers fast builds with less I/O: fewer file writes and less time sunk into waiting for builds to finish. By contrast, although Grunt allows async task implementations, they require more configuration and code, resulting in unnecessary overhead for the developer. Grunt tasks also often require writing intermediate files before passing control to a new module for transformations, resulting in more time spent writing files and less time actually performing transformations.

We did take advantage of run-sequence to better control task execution order (this will be baked into Gulp 4 with series and parallel), but we run all build tasks in parallel and only use it for pre- and post-build tasks like cleaning and versioning.
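A minimal sketch of what that wiring looks like in a Gulpfile (assuming gulp 3 and run-sequence are installed; the build task name is illustrative, while the sequenced task names match the ones listed later in this post):

```javascript
// Sketch: run-sequence wraps an otherwise-parallel build with ordered
// pre- and post-build steps. 'clean' runs alone first, the asset tasks
// run in parallel, then 'rev' versions the output.
var gulp = require('gulp');
var runSequence = require('run-sequence');

gulp.task('build', function (cb) {
  runSequence(
    'clean',                                  // pre-build: runs alone first
    ['html', 'css', 'js', 'fonts', 'images'], // build tasks run in parallel
    'rev',                                    // post-build: versioning last
    cb                                        // signal task completion
  );
});
```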

While we probably could have accomplished these changes by refactoring our Grunt workflow, we decided that time would be better spent reevaluating the entire system.


How we switched

First, we converted our Gruntfile, task by task, over to Gulp tasks. We simplified a lot of things along the way, and began converting our projects to a new build process built on Gulp.

After our initial switch to Gulp things still felt a bit Grunty — it was almost a task-by-task replica. We recognized that we needed to refactor to the streaming paradigm to really take advantage of Gulp’s asynchronous task execution. We also wanted to extract the configuration out of the Gulpfile for a few reasons. Since deployment depends on a stable build process, we discourage edits to our base project Gulpfile. Instead, we abstracted all configuration out of the Gulpfile, and require it in a global config variable to be used by all Gulp tasks. This allows our developers to change settings on a project-by-project basis, while still being able to depend on reliable deployments.

Note, we follow a convention of suffixing each path we write in our settings with a forward slash. Following small conventions like these helps later when debugging, writing new tasks, and composing custom paths.

/** N.B. All paths end with / */
var minimist = require('minimist');

var knownOptions = {
  string: 'env',
  default: { env: process.env.NODE_ENV || 'local' }
};

var options = minimist(process.argv.slice(2), knownOptions);
var src = 'src/';
var dist = 'dist/';
var isCompressing = (options.env === 'staging' || options.env === 'production');
var isDebugging = (options.env !== 'staging' && options.env !== 'production');
var isStyleguiding = true; // change this to (options.env !== '<env>' [&&...]) to disable on specific environments

module.exports = {
  // environment
  compressing: isCompressing,
  debugging: isDebugging,
  styleguiding: isStyleguiding,

  // paths
  src: src,
  dist: dist,
  css: {
    glob: '**/*.css',
    dist: dist + 'css/',
    src: src + 'css/'
  },
  js: {
    src: src + 'js/',
    dist: dist + 'js/',
    glob: '**/*.js'
  },

  // modules
  autoprefixer: {
    browsers: ['> 1%', 'last 2 versions', 'Firefox ESR', 'Opera 12.1']
  },
  bower: { // aka main-bower-files
    bowerDirectory: src + 'bower_components/'
  },
  nodeSass: {
    errLogToConsole: true,
    includePaths: [
      src + 'bower_components/foundation/scss/'
    ],
    sourcemap: isDebugging
  }
};

We also wanted to simplify the build process, since our Gruntfile had accumulated a lot of extra tasks under a complex naming scheme that was difficult to follow. Instead of using different names for the same types of tasks (e.g. compile:js, compile:css, copy:texts, copy:fonts), we now have one task per asset type, named for the asset it is responsible for outputting. We ended up with clear and concise task names that are obvious and easy to remember:

'clean', 'html', 'css', 'js', 'fonts', 'images', 'extras', 'styleguide', 'rev'

Each task is configured based on the settings file and can perform different operations on streams based on these settings. For example, the rev task, which post-processes assets, can optionally compress files based on the compressing variable in the config file.

Tricks that helped us out

Manipulating these text streams wasn’t easy at first: organizing the assets into their respective directories and toggling compression and linting options took some getting used to. Abstracting the configuration to a separate file helped with these manipulations, and there are many other tricks that helped us out: filtering streams, conditionally operating on streams, sending streams through different pipe channels, joining streams, and asynchronously operating on multiple streams. Check out this gulp cheatsheet for a helpful visual guide to some of these.

Note, all gulp plugins are loaded with gulp-load-plugins. You can define any variable name to reference these plugins, and in the following code snippets we use the dollar sign, $, for this. So, no, that is not jQuery in our gulpfile!


Abstracting configuration

Use when: You want to separate configuration from gulp tasks.

Since our configuration is a plain JavaScript object, we can flexibly make settings for all kinds of tasks:

var config = require('./config.js');
var $ = require('gulp-load-plugins')({ lazy: false });

gulp.task('html', function() {
  return gulp.src( [config.src + config.html.glob, '!' + config.bower.bowerDirectory + '**'] )
  .pipe( gulp.dest(config.dist) );
});

gulp.task('css', function () {
  return gulp.src( [config.css.src + config.css.glob] )
  .pipe( $.sourcemaps.init() )
  .pipe( $.sass(config.nodeSass) )
  .pipe( $.autoprefixer(config.autoprefixer) )
  .pipe( $.sourcemaps.write() )
  .pipe( gulp.dest(config.css.dist) );
});

Filtering streams

Use when: You need to operate on a subset of a stream based on a glob.

To apply modules to just a subset of the files in a stream, we can use a couple of different modules that filter files from Gulp streams.

First up, gulp-if is a module that allows you to conditionally operate on files based on globs. An example of this in use is for filtering font files from all bower components:

return gulp.src( require('main-bower-files')({ paths: config.bower }), { read: false } )
.pipe( $.if(config.fonts.glob, gulp.dest(config.fonts.dist)) );

Next, we use gulp-filter to narrow the scope of a stream, perform a few operations, and return to the original stream’s state when we’re done. We use this to avoid linting vendor files in our source directory — for the inevitable script that we need to include manually for some reason:

var vendorFilter = $.filter(['*', '!' + config.js.src + 'vendor/**']);

return gulp.src( [config.js.src + config.js.glob] )
.pipe( vendorFilter )  // avoid hinting assets/js/vendor files
.pipe( $.jshint() )
.pipe( vendorFilter.restore() ) // restore assets/js/vendor files to be copied to dist
.pipe( gulp.dest(config.js.dist) );

Conditionally operate on streams

Use when: You need to perform operations based on a boolean.

Back up to the plate is gulp-if. Along with filtering by glob, this module is useful for conditionally operating on streams based on a boolean. Since we have a lot of booleans in our configuration variable, we use this all the time to transform our files in different ways based on environment. For example, we may not want source maps to be generated for production deploys, but we do for development. Or we may want to compress asset files only on production to make debugging easier on development:

return gulp.src( config.dist + '**/*' )
.pipe( $.if( config.compressing, compressChannel() ))

Sending streams through different pipe channels

Use when: You need to make an optional pipe to send a stream through.

If the above example looks incomplete, that’s because it’s missing context for what we’re doing with compressChannel. That’s where lazypipe comes into play. We can save a Gulp pipeline to a variable using lazypipe in order to conditionally pipe our streams through it later. This technique completes the above example to show how we only compress files during post-processing on production environments:

var lazypipe = require('lazypipe');

var compressChannel = lazypipe()
.pipe(function() {
  return $.if(config.js.glob, $.uglify());
})
.pipe(function() {
  return $.if(config.css.glob, $.minifyCss());
})
.pipe(function() {
  return $.if(config.html.glob, $.htmlmin(config.htmlmin));
});

return gulp.src( config.dist + '**/*' )
  .pipe( $.if( config.compressing, compressChannel() ))
  .pipe( gulp.dest(config.dist) );

Joining streams

Use when: You need to add source files to your stream or perform different actions on similar files.

Don’t always jump to using this trick. Usually you can get away with using globbing in gulp.src, and the streams are run synchronously (it is a queue after all), so this is rarely needed. However, we used this to join two separate source streams into a single stream to simplify our fonts task. Instead of using two different tasks to grab fonts included in our bower components and project specific fonts in our source directory, we used StreamQueue to grab the sources individually in one task:

var streamqueue = require('streamqueue');
return streamqueue({ objectMode: true },
  gulp.src( require('main-bower-files')({ paths: config.bower }), { read: false } ),
  gulp.src( [config.fonts.src + config.fonts.glob] )
).pipe( $.if(config.fonts.glob, gulp.dest(config.fonts.dist)) );

Generally, StreamQueue is better suited to performing different actions on similar files. This lets you keep all of the operations for a type of file in one task, while still accomplishing everything needed for the build process. For example, if you had different sets of CSS files to process with different modules, a good way to use StreamQueue would be:

return streamqueue({ objectMode: true },
  gulp.src( './css/src/**/*.less' )
  .pipe( less() ),
  gulp.src( './css/src/second.css' )
  .pipe( autoprefixer('last 2 versions') )
);

Asynchronously operating on multiple streams

Use when: You have multiple (and probably different) streams to run to complete a single task.

Since one of our goals was to simplify the build process, keeping all related tasks under a single named task in our Gulpfile was helpful for achieving this. That said, it made it interesting to tackle building our CSS and JS files from different sources. For example, we use browserify for dependency management and bundling of JS files, where we don’t always want to include main bower vendor files that require shimming etc. Furthermore, we wanted to be explicit about which vendor files we were including, and the gulp-useref workflow inspired us a lot here. To build all of our CSS and JS files, we add our vendor files to an assets.json file defining what file names to write and which files to include in them:

{
  "css": {
    "assets/css/vendor.css": [ ... ]
  },
  "js": {
    "assets/js/vendor.js": [ ... ]
  }
}

With this, we can map the array of files to create an array of Gulp streams to be executed asynchronously, all while streaming:

/**
 * Creates an array of gulp streams for concatenating an array of files
 * @param {object} files - key is destination filename, val is array of files to be concatenated
 * @param {string} dist - destination path
 */
function vendorMap(files, dist) {
  var path = require('path');
  return Object.keys(files).map(function(distFile) {
    var srcFiles = files[distFile].map(function(file) {
      return config.src + file;
    });

    return gulp.src(srcFiles)
    .pipe( $.concat(path.basename(distFile)) )
    .pipe( gulp.dest(dist + path.dirname(distFile)) );
  });
}

And finally, we merge these different streams using event-stream’s merge function:

var es  = require('event-stream');
var vendorFiles = require('./assets.json');

var appStream = gulp.src( [config.js.src + config.js.glob] )
.pipe( gulp.dest(config.js.dist) );

var vendorStream = vendorMap(vendorFiles.js, config.js.dist);

return es.merge.apply(null, vendorStream.concat(appStream));

In conclusion

Our new Gulp workflow is still maturing but has been happily building projects in production environments for a couple of months now. We’ve made a few changes along the way, but they have been small and quite easy to implement. For example, we ripped out gulp-ruby-sass and replaced it with gulp-sass for native libsass speed improvements. This change was nearly effortless since our config was abstracted out and our tasks have a clear separation of concerns. All it took was renaming and adjusting the Sass options in our modules configuration and adjusting the css task. We didn’t need to touch any paths or other tasks; it just worked.

This level of flexibility is not only desirable for current projects; it also lowers the barrier to entry for developers joining our team. We don’t have the burden of walking new devs through lines of custom code; instead, we have a clear system that is easy to digest and approachable by everyone after a skim of the documentation we’ve written for these few high-level architecture decisions. Future modifications are not hindered by mountains of code changes, and the conventions we follow simplify complicated infrastructure, which lets our devs save time and focus on getting tasks done.

Now Accepting Applications for Design Interns!


iStrategyLabs is on the hunt for new and talented design interns! The position provides an agency-wide experience with a focus on design production for one of our biggest clients. Your day-to-day responsibilities, like assisting in studio photoshoots and set-ups, prop curation and acquisition, and social media content design, will amount to a standout resume. If you’re passionate about design and excited about learning the agency ropes, please send a portfolio and resume to our creative director, Zach Goodwin. We look forward to hearing from you!

The Perfect Candidate:

•  Knows Adobe Creative Suite like the back of their hand
•  Has Mac proficiency
•  Is comfortable using fancy Canon DSLRs (specifically the Canon 5D Mark III)
•  Has basic photo/video editing abilities
•  Is a cool person, you cannot be lame

Position Details:

•  3 month commitment
•  9am – 6pm, Monday – Friday
•  Weekly Stipend
•  School Credit available

Bonus Skills:

•  Studio photography
•  Illustration
•  Web/UX skills

William Grant & Sons Hires ISL as Agency of Record for Seven Brands



Over the past few years, ISL has grown and honed our service offerings to focus on the intersection of online and off with brands such as Miller Coors and Facebook while continuing to partner with top brands such as Kroger on all things digital and social. The intersection of these core services is our sweet spot (others agreed: check out our Small Agency of the Year award!).

When William Grant & Sons, an independent, family-owned Scottish company that distills Scotch whisky and other select categories of spirits, began a process last year to find a new Digital and Social Media Agency of Record, they found what they were looking for in ISL.

After a competitive review, we’re proud to announce that ISL is William Grant & Sons’ new Digital and Social Media Agency of Record, responsible for digital and social media activities for 7 of their portfolio brands in the US.

If you know ISL, you know that this couldn’t be a better fit! Here’s a rundown of the global brands the ISL team will be working on out of our NYC and DC offices:



The Balvenie

Tullamore D.E.W.


Hendrick’s Gin


Sailor Jerry Rum


Flor De Caña Rum


Milagro Tequila


Introducing The Social Memo: January 2015


Introducing the Social Memo: a series of posts covering new tools, platform changes, and interesting data in social media.

Facebook has made some major announcements in the past few months that are changing the way marketers manage their Facebook pages. Perhaps the most notable change is to their newsfeed algorithm, which penalizes the organic reach of promotional posts. This decision is based on the social network’s survey data, which revealed that users want to see more content from friends and the brand pages they “like,” and less promotional content. According to Facebook, there are three main traits that make an organic post appear promotional:

1. Posts that solely push people to buy a product or install an app

2. Posts that push people to enter promotions and sweepstakes with no real context

3. Posts that reuse the exact same content from Facebook Ads

The two ads below are examples, provided by Facebook, that exhibit these qualities:



While this change is positioned as a benefit to users, which I believe it is, it is also clear that Facebook is sending a powerful message to marketers: if you want to advertise on Facebook, then you are going to have to pay. That said, Facebook is deliberate in stating that this move does not change how paid promotions work.

Note that the key word here is “advertise.” If you are simply trying to push your product, then you will need to pay for those advertisements, because that is exactly what they are…advertisements. It’s no different than having to pay for placement in a magazine, on a billboard, or on television. Facebook is not implementing anything detrimental to those brands that are producing quality social content for their fans. That means brands can still generate engagement on Facebook without having to pay for it — they just need to stop producing “advertisements” and start creating awesome content.