
Systems Alliance Blog

Opinion, advice and commentary on IT and business issues from SAI
Date: Jul 2016

In recent years, website designs have become more complex and multimedia-intensive. High-resolution, Retina-ready hero images, video, parallax scrolling, animations, social media widgets and a plethora of other tools have been incorporated to create engaging and immersive online experiences. The goals are certainly noble: to create unique and engaging websites for our clients that will help them achieve their online marketing objectives. Significant improvements in bandwidth, both wired and cellular, and in device capabilities have further encouraged us to push the envelope with highly dynamic online experiences.

The evolution of technology and design patterns has led our industry a bit overboard, too often building slow, bloated web pages to achieve this interactivity. While unheard of just a few years ago, it’s now common for web pages to be several megabytes in size, and that bloat hurts both the site visitor’s experience and our results (leads, conversions, sales, etc.). Slow website performance has been shown to increase page and site abandonment and to depress conversion rates. Studies differ on the magnitude, but the bottom line is consistent: faster is better. Faster load times correlate with more conversions and lower bounce rates.

It’s not just a user experience issue; search engine optimization (SEO) is also an important consideration. As a reminder, Google has used website performance as a search ranking factor since 2010. Although it’s unclear how significant a role performance plays in the algorithm today, any opportunity to optimize for Google’s algorithm matters.


Achieving Results

Ultimately, our primary goal is to design and build websites that achieve marketing and business objectives, not just win awards (although that’s nice, too). I’m not suggesting that we step back and build sites that are flat or generic; rather, I’m advocating for striking the appropriate balance among performance, interactivity and visual flair. That’s not always easy. Our clients seek distinctive and engaging websites, our designers want to exercise their creative freedom, and our developers must execute on those design visions to create interactive yet high-performing websites. So the key question is: What can we do to effectively balance these sometimes competing goals?

First, what do we mean by a high-performance website?

There are two key dimensions to a “high-performance website”: true performance and perceived performance. By true performance, we mean objective measures based on the total size of the page, the available bandwidth and the complexity of the code, which determine how quickly the browser begins downloading the page and completes rendering it. You can easily test the true performance of your pages using browser plugins and various online tools.
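As a simple illustration of what those objective metrics look like, current browsers expose them through the built-in Navigation Timing and Resource Timing APIs. The snippet below is a minimal sketch you could paste into a page or the developer console; it only reports the numbers, and the thresholds you compare them against are up to you.

    // Minimal sketch: reading true-performance metrics from the browser's
    // Navigation Timing (Level 2) and Resource Timing APIs.
    window.addEventListener('load', function () {
      // Defer one tick so the load event's own timing has been recorded
      setTimeout(function () {
        var nav = performance.getEntriesByType('navigation')[0];
        if (!nav) { return; } // older browsers may not expose these entries

        console.log('Time to first byte (ms):', Math.round(nav.responseStart));
        console.log('DOM content loaded (ms):', Math.round(nav.domContentLoadedEventEnd));
        console.log('Full page load (ms):', Math.round(nav.loadEventEnd));

        // Approximate total page weight: the document plus all sub-resources
        var totalBytes = performance.getEntriesByType('resource')
          .reduce(function (sum, r) { return sum + (r.transferSize || 0); }, 0) +
          (nav.transferSize || 0);
        console.log('Approximate page weight (KB):', Math.round(totalBytes / 1024));
      }, 0);
    });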

Perceived performance is more subjective: it’s how fast a visitor thinks your page or site is, so the distinction here is appearance versus reality. By perceived performance, we mean how quickly the visitor can see and interact with your content. Even if the browser is still loading code in the background or rendering components below the fold, the visitor perceives that the page has loaded and the site is fast, because he or she can immediately engage with it. Usability testing your site with target audiences is one of the most effective ways to evaluate perceived performance.

So what’s really important – true or perceived performance?

Both are, but for different reasons. When it comes to Google’s analysis of a page’s or site’s speed, it’s all about true performance. Google’s crawler measures page size and download speed, so to maximize SEO for your site you do need to pay attention to true performance. When it comes to usability, however, it’s all about perceived performance. A site visitor only cares about what they can see on the page and how they can interact with it. If code or content is still downloading in the background or out of the visitor’s view, it doesn’t affect them; they’re not even aware of it.


Applying this to your website development projects

Let’s shift gears from the theoretical to how we can use this information to build better, higher performing websites.

Establish a performance “budget” for the site

The best way to stay on track is to establish goals that everyone can agree on before design and development begin. Identify maximum or average page size values for your website as well as maximum download times on common wired/Wi-Fi and cellular connections. You may ask what those maximums should be, but there’s not a one-size-fits-all answer. It really depends on the industry and objective of the website. For example, a website for a museum will have different performance targets than a transactional ecommerce site.
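To make the budget concrete, it helps to capture those targets in a form the whole team, and ideally the build process, can check against. The following is a hypothetical sketch; the property names and numbers are purely illustrative and should be set per project rather than taken as recommendations.

    // Hypothetical performance budget (illustrative values only)
    module.exports = {
      maxPageWeightKB: 1500,        // total transfer size per page
      maxRequests: 60,              // total HTTP requests per page
      maxLoadTimeMs: {
        broadband: 2000,            // typical wired/Wi-Fi connection
        cellular: 5000              // typical cellular connection
      },
      maxImageWeightKB: 800         // share of the budget reserved for imagery
    };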

Collaborate

One of the most effective means of ensuring that a website is both engaging and performing well is close collaboration between designers and developers. That has become easier in recent years with a shift to more agile development approaches. But the key here is for designers and developers to work together during the design process to ensure that the components and interactions being designed don’t have a detrimental effect on performance.

This project phase is really where you strike the balance between engaging design and site performance, addressing questions like, “Is that slider with complex animations really worth the performance impact, or can we achieve the user experience objective another way?”

Take advantage of perceived performance

We don’t want to hobble our designers’ creativity, and we absolutely still want to push the envelope with exciting, original design concepts. So as we build web pages, we want to prioritize both content and code. Techniques like asynchronous loading, or ensuring that page elements above the fold fully load before elements further down the page, help site visitors begin interacting with the page as quickly as possible. This is a bit of smoke and mirrors, but it meaningfully improves the user experience on unavoidably complex, code-heavy websites.
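As one example of this kind of prioritization, below-the-fold images can be deferred until the visitor scrolls near them. The sketch below uses the standard IntersectionObserver API and assumes images are marked up with a data-src attribute holding the real image URL, which is a common convention rather than a requirement.

    // Minimal sketch: lazy-load images marked up as
    // <img data-src="large-photo.jpg" alt="...">
    document.addEventListener('DOMContentLoaded', function () {
      var lazyImages = Array.prototype.slice.call(
        document.querySelectorAll('img[data-src]')
      );

      if (!('IntersectionObserver' in window)) {
        // Fallback for older browsers: load everything immediately
        lazyImages.forEach(function (img) { img.src = img.dataset.src; });
        return;
      }

      var observer = new IntersectionObserver(function (entries) {
        entries.forEach(function (entry) {
          if (entry.isIntersecting) {
            entry.target.src = entry.target.dataset.src; // start the real download
            observer.unobserve(entry.target);            // stop watching once triggered
          }
        });
      }, { rootMargin: '200px' }); // begin loading shortly before the image is visible

      lazyImages.forEach(function (img) { observer.observe(img); });
    });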


The nitty gritty technical stuff

Getting back to true performance, which of course affects perceived performance as well, you’ll want to make sure that you’re following general web development best practices. For instance:

  • Load code/content asynchronously and avoid render blocking components. This is a big one for perceived performance. Code your pages so the visitor doesn’t have to wait for all assets to load before being able to see and interact with the page.
  • Optimize/compress images to minimize page download size. This is more important now than ever as digital camera resolutions have increased and image sizes have gotten bigger. Optimize images to ensure size and resolution are no larger than needed for the particular application.
  • Leverage browser caching. Configure your site and web server to enable browser caching of image and code assets to speed up the experience for repeat visitors.
  • Make use of web server compression. This reduces the size of the assets the site visitor’s browser needs to download (see the sketch after this list for a simple example of both caching and compression).
  • Minify style sheets and JavaScript. Stripping the white space out of JavaScript and CSS files helps make their download as small and efficient as possible.
  • Avoid content that requires plugins. It takes the browser additional time to load a plugin, and then more time to render that plugin’s content. These days, we can achieve pretty much everything we need by just using HTML5, CSS and JavaScript.
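For the caching and compression items above, here is a minimal sketch of what that can look like on a Node.js/Express server, assuming the widely used compression middleware package; equivalent settings exist for Apache, Nginx, IIS or whatever server you run, and the max-age value is just an example.

    // Minimal sketch (Node.js/Express): compress responses and let browsers
    // cache static assets for repeat visits.
    var express = require('express');
    var compression = require('compression'); // npm install express compression

    var app = express();

    app.use(compression()); // gzip/deflate text-based responses

    app.use('/assets', express.static('public', {
      maxAge: '30d' // sets Cache-Control for images, CSS and JavaScript
    }));

    app.listen(3000);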

You can use a number of free online tools to performance test your site’s pages. A particularly useful one is Google’s PageSpeed Insights, which provides great information on factors impacting a web page’s performance and how to address specific performance issues.

Putting it all together

Circling back to our overall objective, we’re out to create outstanding user experiences for website visitors, across all devices, in order to help our clients achieve their digital marketing goals. Finding the appropriate balance between design and interactivity on one hand, and performance and usability on the other, is a key part of doing that. That doesn’t mean building boring websites, but rather making the right design and development trade-offs. Please drop me a note at mdabrowski@systemsalliance.com if you have questions or would like to discuss any of this further.

Nonprofits around the region are scrambling to address budgetary gaps caused by changes in labor laws. As leaders search for solutions, they may be overlooking opportunities to cut operating costs and grow revenues through smarter application of information technology.

On July 1st, the minimum wage in Maryland increased by nearly 6%, while in DC it jumped by almost 10%. On June 27th, the District took things a step further when the Mayor signed a bill to raise the city’s minimum wage to $15 an hour by 2020. Similar legislation is expected to pass soon in the City of Baltimore. These changes will be followed quickly by another expensive policy change: new FLSA overtime rules go into effect this December, adding to the already heavy pressure on regional nonprofits’ budgets.

Unless nonprofit leaders find innovative ways to cover these substantial payroll cost increases, many will be forced to make tough decisions in the next few months. A recent article in the Baltimore Business Journal highlighted the potentially devastating effects that the wage increases could bring, with some fearing that organizations may be “forced to cut services, lay off workers or even shift locations”. 

This is obviously a nightmare scenario for many organizations and unless we are prepared to see diminished roles for important nonprofits around the region, action must be taken now to ensure these institutions can continue to serve the community.

One of the ways proactive leaders can get ahead of this coming fiscal crunch is by ensuring their organizations are running at peak efficiency.  For many that means a much closer look at their Information Technology portfolios including capabilities, budgets, and governance.

IT Capabilities

Information technology has grown tremendously powerful in the last few decades. Staff walk around today with more computing power in their pockets than the systems that guided astronauts to the moon. Just because IT is powerful, though, doesn’t mean it is being optimally deployed in an organization.

Nonprofits have traditionally lagged behind other industries in adopting new technologies. Many hold on to systems long past what most corporate organizations would consider their useful service life. That may save money by deferring replacement costs, but as these systems age they bring other problems to light. Support costs often increase over time as it becomes more and more difficult to find qualified staff to maintain them. In addition, a lack of automation, an inability to integrate systems, and the emergence of inefficient processes that have grown up around out-of-date technology are all a drag on the efficiency of today’s nonprofits.

Beyond the hardware, software, and services being deployed, many organizations aren’t able to maximize their existing IT investments due to gaps in their users’ knowledge. Targeted training focused on process improvement, better and more approachable documentation, and an ongoing effort to grow knowledge should be part of any IT planning initiative.

IT Budgeting

When it comes to budgeting, IT is often seen as a cost center whose budget should be reduced during lean times. This thinking is shortsighted and outdated, and across-the-board cuts often misfire. There are numerous examples where an increase in IT spending drove substantial cost reductions elsewhere in the business. If deploying new IT capabilities can deliver efficiencies elsewhere, how does cutting the IT budget make sense?

Similarly, attempting to align IT budgets with industry benchmarks often delivers less than stellar results. Those numbers bear no relation to the organization’s structure, scale, capabilities, or mission, so budgeting based solely on them is largely meaningless.

IT Governance

One of the least understood components of information technology is IT Governance.  Put simply, this is the method by which decisions about IT are made and executed.  Decision rights for IT go beyond the IT department or the CIO, and involve a broad base of stakeholders. There are two very common governance structures that both have substantial drawbacks for organizations undergoing change. 

The first is a decentralized, ad hoc approach to IT. This is a weak form of governance in which decisions are often made by individual users, managers, or departments. A lack of standardization and planning has predictable results: systems and applications are often incompatible with one another, and the costs to maintain the IT infrastructure are very high.

The second is more of a “dictatorship” model where a strong IT department dictates standards, deploys systems, and defines the future plans for the organization.  The gap here is that while the trains may run on time, they don’t necessarily go where users need them to.  The end users are often left wanting (sometimes they revolt) and IT can end up misaligned with the rest of the organization.

A better path is to strike a balance between these two approaches. Certain aspects of IT should be managed by the IT department, but with input from users, leadership, and even outside advisors. Other decisions should be left to the business, with IT in a supporting role to enable that vision. An effective governance structure allows organizations to maximize the utility of their IT investments and keep control over their direction.

#TimeForAChange

As progressive reforms around compensation continue to sweep through the region, it is time for nonprofit leaders to prepare their organizations to meet the upcoming fiscal challenges. IT planning and strategy should be a key part of that conversation. If you’re ready to get started before things get rough, here are a few options to consider:

  1. Move Infrastructure to the Cloud – This shifts your fixed costs to variable costs that can adjust to reflect the size and scope of your organization. Reduced complexity of in-house infrastructure also means you can potentially reduce your IT staffing needs. Significant discounts given to nonprofits by the major cloud players (Microsoft, Google, and Amazon) make this a very affordable proposition if you have the right team on your side to plan and manage the transition.
  2. Consider New Applications to Reduce Costs – IT is ubiquitous throughout your organization, but are you using it to cut back on costly administrative tasks? Reducing overhead around policies and procedures can free up senior staff to focus on the mission instead of shuffling papers around. Cutting training time means you can quickly train staff and reduce your onboarding costs, which will be particularly important for high-turnover roles that will be increasingly expensive to fill. Having a solution that validates employee compliance and acknowledgement around policies can lower the risk of legal action. All of these can be addressed with one solution: the Acadia Performance Platform.
  3. Revamp Your Web Presence – Does your website drive donors to you or is it a source of frustration?  Are visitors able to understand your mission and help support your goals?  Can your staff update and maintain it without having to jump through hoops?  If your website doesn’t match the professionalism of your organization, redesigning it can help bring you more success.

Mark Stirling is the Director of SAI’s IT Strategy and Operations practice and has worked closely with nonprofit clients including the Maryland Zoo in Baltimore. You can find more of his posts and other insights from SAI on the Systems Alliance Blog.
