Articles

A collection of things I’ve written.

Installing Jenkins as Continuous Integration builder on Windows

5 min read

While most of my work is Mac/Unix-based, some of my business work is built on VB.NET/ASP.NET inside a Windows development environment. While striving for more robust, predictable code, I chose to implement a Continuous Integration (CI) system. I had already used CI in large-scale game production environments with great success, following the TDD model.

Setting up a Continuous Integration server was the first task, and I chose Jenkins for the job. It had to be set up on a Windows system so that it could invoke Visual Studio 2010 to build the required project parts, a mix of VB applications and ASP.NET web services. We also had a requirement to use Perforce as the SCM of choice.

I’m assuming you already have a working build of your product on a host machine with your VS2010 build already cleanly building via a solution file.

Here’s how I did it:

Install Java Runtime Environment (JRE)

Jenkins requires Java so you’re going to have to ensure you have that installed before you can go much further. Oracle maintain the JRE so go there now and pop back when you’ve got it installed. http://www.oracle.com/technetwork/java/javase/downloads/index.html

Download and install Jenkins

Once again, there’s a nice handy page over at Jenkins HQ that includes all the info you need on how to get Jenkins installed. Don’t forget to come back for more settings.

https://wiki.jenkins-ci.org/display/JENKINS/Meet+Jenkins#MeetJenkins-Installation

Jenkins Setup

Within our organisation we typically keep all of our development files in a common directory regardless of host machine that maps straight into the Perforce Depot at a consistent point. This ensures we can get developers up & running very quickly. In our case, it’s:

C:\dev

Firstly, we’ll relocate Jenkins into that directory so we can maintain it inside our Perforce Depot. So locate your downloaded & unpacked ‘jenkins.war’ and move it to:

C:\dev\Build\jenkins\jenkins.war

To test our setup, we’re going to start the server manually via the command prompt so we can see all the various messages it produces. This will help us spot errors and get to know how Jenkins works. Inside your Command Prompt, issue:

java -jar C:\dev\Build\jenkins\jenkins.war

If this all looks good, check the Jenkins web service is running by visiting the default location http://localhost:8080/ in your web browser. Remember this is a localhost address, so it’ll only work if Jenkins is running properly. We’ll be running Jenkins as a Windows Service once we’re happy everything is running OK and we can trust it to work cleanly in the background and start up with the system.

Install MSBuild plugin for Jenkins

The MSBuild plugin for Jenkins lives at https://wiki.jenkins-ci.org/display/JENKINS/MSBuild+Plugin. You’ll notice there’s no download link on there, as plugins are installed through the local Jenkins browser interface.

Inside the Jenkins browser you’ll be installing the MSBuild plugin via Jenkins -> Manage Plugins -> Available; you’ll find MSBuild in the ‘Build Tools’ section.

NOTE: This page should be populated with available plugins; however, if you’ve just installed Jenkins and got to this point quickly, it’s likely that the list will be empty. If so, try two things: (a) wait for the plugin repository to download, but be warned there are no progress indicators for this; (b) restart the Jenkins service to populate the list.

Configure location of MSBuild.exe

Go to your local Jenkins configuration via http://localhost:8080/configure, navigate down that page to the MSBuild section and click ‘Add MSBuild’:

  • Name: .NET 4.0
  • Path : C:\Windows\Microsoft.NET\Framework\v4.0.30319
  • Default Params: empty

You should now have the prerequisites to be able to make a build Job via Jenkins.

Build Job

Add a new Build Job

New Job (http://localhost:8080/view/All/newJob) – Job Name: ProjectName – Type: Build a free-style software project – OK

Set up your Job to execute an MSBuild script: navigate down through Build -> Add Build Step -> Build a Visual Studio project – MsBuild Version: .NET 4.0 – MsBuild Build File: C:\dev\Projects\ProjectName\Project.vbproj

That’s it for the basic build. You should now be able to get Jenkins to run your build job and have it build locally without errors. If it’s not working right, sort it out now before adding in SCM.

Continuous Integration via Perforce

Now you’ve got a working build (if not then read up and come back) it’s time to connect your local build to the Perforce depot so we can get your build system automatically syncing & building your project when anything changes in Perforce.

Install Perforce plugin

Install Perforce plugin for Jenkins in a similar way to how we installed the MSBuild plugin via Jenkins -> Manage Plugins -> Available

Configure Perforce

We’re now going to configure the default Perforce command for the whole of Jenkins.

Go to http://localhost:8080/configure and navigate down to Perforce section of the configuration click ‘Add Perforce’ – Name: p4 – Path: C:\Program Files\Perforce\p4.exe

The system wide default Perforce command is now installed and ready for connection to a Job.

Add Perforce to Job

Next we’re going to join the command into the job so they execute when appropriate.

Go to the job configuration page for the Job you’re setting up and locate ‘Source Code Management’ and select ‘Perforce’ – P4PORT: your perforce server IP:1666 – Username: your perforce username – Workspace: jenkins

Ensure the perforce workspace/view mappings are accurate inside the Job -> Perforce settings. A good way is to copy the view from a working p4 workspace inside the p4v application itself and replace the original workspace name with ‘jenkins’.

Check Perforce Works

It's now time to see if you've done everything right. Starting a Build for the Job inside Jenkins should work smoothly now. But we're not done yet; we've still got to set up the triggers to complete the CI process.

Poll SCM

So now you’ve got a working installation of Jenkins, your build works cleanly, and it integrates your project from Perforce when you initiate a build. It’s all good, but it’s not very Continuous. To make this work we’re going to get Jenkins to poll Perforce every minute for changes and then automatically start the Job we already set up. Find the setup portion inside Job -> Configure -> Build Triggers – [x] Poll SCM (checked) – Schedule:

# every minute
* * * * *
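For reference, the five schedule fields are minute, hour, day-of-month, month and day-of-week, and Jenkins accepts standard cron ranges and steps. If polling every minute puts too much load on your Perforce server, a gentler schedule looks like this:

```
# min  hour  day-of-month  month  day-of-week
*/5 * * * *     # poll every 5 minutes instead of every minute
```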

Done

Well, that’s it. You’ve now built yourself a Continuous Integration server to watch over you and make sure your builds are consistently building, catching the times you’ve forgotten to add something to the repository, failed to submit a complete working build, or slipped up in a myriad of other ways.

You may want to install some desktop notifications for clients, set up some NUnit tests and really start building on the good foundations you’ve laid.

app list

  • Day One.app, Disk Inventory X.app
  • Dropbox.app, Skype.app, Day-O.app, Flux.app, Hazel, KeyRemap4MacBook, MenuMeters, Transmission.app, Twitter.app
  • Perian, VLC.app, HandBrake.app, Flip4Mac, Google Chrome.app
  • CodeKit.app, DiffMerge.app, MAMP PRO, MacGDBp.app, Sequel Pro.app, Sublime Text 2.app, Tower.app, VMware Fusion.app, Xcode.app
  • iA Writer.app, nvALT.app

Setting up OSX for Web Development

3 min read


Posted: 2012-04-11 16:57:46

(I will be maintaining this Post with my current setup steps as time moves on)

I’ve been using OSX for Web Development for a while now, initially on a Mac Mini, then migrating to a MacBook Pro and on to a MacBook Air. I thought it was a good idea to get my web development software setup down so I can recall it more easily and get things moving a lot more quickly when I need to re-set everything back up again.

Read on to find out how.

Main apps

After starting from a fresh OSX install I usually follow up with a few default web development applications that I have stored on my local server under ‘app_0’ to indicate that they’re critical.

  • XCode - install via the App Store. While it’s not used for web development directly, it is required as it includes vital system tools that others depend on, such as PEAR and Brew/MacPorts. It’s a big old download, so it’s worth starting early; make sure you’ve got some elbow room on your local HD.
  • Google Chrome - my browser of choice. I also use the regular suite of Firefox, Safari and some VMware Fusion virtual machines for testing
  • MAMP Pro - the nuts and bolts Mac Apache, MySQL and PHP setup
  • Sublime Text 2 - an awesome text editor. I was using the ubiquitous TextMate for a long time but I shifted over to ST2 and haven’t looked back. More on my configuration later.
  • Sequel Pro - a useful interface into MySQL when the default phpMyAdmin isn’t enough. It has a very handy ‘Optimise Type’ feature that I’ll cover later.
  • Tower - a visual Git client that’s got some amazing features for when you fancy moving away from the Terminal
  • CodeKit - I use this to continuously compile SASS (scss) & Compass for my web projects.

Install XCode

Then install the command line tools by running XCode and navigating to:

Preferences->Downloads->Command Line Tools->(Install)

Stop losing disk space

Prevent disk space erosion and incorrectly reported free space by disabling Time Machine local snapshots with the following command:

sudo tmutil disablelocal

Fix Terminal

OSX Terminal has a nasty habit of dropping characters over SSH, and I’ve hunted down a Preferences change that really helps fix the problem.

Inside Terminal Preferences:

-> Advanced -> Declare Terminal As: (xterm-color)

Setup PEAR

PEAR is used to install and manage some PHP tools, so it’s worth setting up early. Reference: http://blog.borntocode.com/2011/03/complete-php-dev-environment/

ln -s /Applications/MAMP/bin/php/php5.2.17 /Applications/MAMP/bin/php5

This ‘mv’ operation fixes a broken default PEAR installation within MAMP:

mv /Applications/MAMP/bin/php5/conf/pear.conf /Applications/MAMP/bin/php5/conf/pear.conf.backup

nano ~/.profile

add export PATH=/Applications/MAMP/bin/php5/bin:$PATH to ~/.profile

sudo /Applications/MAMP/bin/php5/bin/pear channel-update pear.php.net
sudo /Applications/MAMP/bin/php5/bin/pear upgrade pear
/Applications/MAMP/bin/php5/bin/pear -V

Check the version output looks right, then register the extra channels we’ll need:

sudo /Applications/MAMP/bin/php5/bin/pear channel-discover pear.phpunit.de
sudo /Applications/MAMP/bin/php5/bin/pear channel-discover components.ez.no
sudo /Applications/MAMP/bin/php5/bin/pear channel-discover pear.symfony-project.com

PHPUnit - testing framework

PHPUnit is a unit test framework that allows you to run automated tests on your code to ensure it’s consistently robust and error-free. Unit testing is a fundamental part of Test Driven Development.

sudo /Applications/MAMP/bin/php5/bin/pear install phpunit/PHPUnit
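Once installed, a minimal test case looks something like the sketch below (the class and test names are purely illustrative, not part of any real project); run it with `phpunit CalculatorTest.php`:

```php
<?php
// A hypothetical PHPUnit test case (PHPUnit 3.x style)
class CalculatorTest extends PHPUnit_Framework_TestCase
{
    public function testAddition()
    {
        // Assert that the value under test matches our expectation
        $this->assertEquals(4, 2 + 2);
    }
}
```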

Phing - build tool

Phing is a build tool that can save you a whole load of grief by running unit tests (PHPUnit), syntax checks (lint), deployment scripts and much more. All of this automatically checks your development is pretty solid before it gets published.

sudo /Applications/MAMP/bin/php5/bin/pear channel-discover pear.phing.info
sudo /Applications/MAMP/bin/php5/bin/pear install phing/phing
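As a taste of what Phing can do, a minimal `build.xml` might look something like this; the project, target and path names here are illustrative assumptions, not a prescribed layout:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="MyProject" default="test">
    <!-- Syntax-check every PHP file before anything else runs -->
    <target name="lint">
        <phplint haltonfailure="true">
            <fileset dir="src">
                <include name="**/*.php"/>
            </fileset>
        </phplint>
    </target>

    <!-- Run the PHPUnit suite; fail the build on a non-zero exit -->
    <target name="test" depends="lint">
        <exec command="phpunit tests" checkreturn="true"/>
    </target>
</project>
```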

I'll post more on the specific configuration later

Install DiffMerge

via http://sourcegear.com/diffmerge/downloads.php

Install GIT

GitHub maintains a great page on Set Up Git for Mac, which you should follow before returning here. We’ll be using GIT for semi-automated deployment to your production servers later.

Local Terminal Configuration

Now you have all of the necessary files installed you can go ahead and start configuring your local system.

I keep a GIT repository of all of my hidden ‘dotfiles’ (cloned below) that you can use as a great starting point for your own config

cd ~/Projects
git clone git@github.com:gamelinchpin/dotfiles.git
cd dotfiles
./bootstrap.sh

Restart Terminal and all the relevant config files should be in the right places.

Summary

Hopefully you’ll be starting to get a decent setup and the following posts should help solidify a nice web development environment.

Game Development Post Production - SHU GSPM S2W2

Below is a link to the presentation I gave at Sheffield Hallam University for the BSc (Honours) Games Software Development module Game Software Project Management about Post-Production.

**[The Post-Production Cycle - Game Software Project Management](http://www.slideshare.net/gamelinchpin/the-postproduction-cycle-game-software-project-management "The Post-Production Cycle - Game Software Project Management")** from **[gamelinchpin](http://www.slideshare.net/gamelinchpin)**

presentation

Optimising an eCommerce Site

4 min read

Our development work this week has included a lot of server optimisation for an eCommerce website; we dramatically improved response times and lowered overall CPU and memory use by making changes to the server configuration.

The benefit is a better experience for Visitors, Customers and Google (page load speed is a factor in Page Rank), and it also enables the server to cope with higher volumes of visitors without becoming overloaded.

All of the changes were made to the configuration of the server itself, no changes were made to the website to gain these benefits.

Memory

We started by improving memory use, as memory was frequently exhausted, causing the site to become unresponsive and, on rare occasions, unusable.

The main changes came from changes to the Apache, MySQL and PHP configurations to reduce peak memory use to something the server could cope with.

MySQL

The first thing to do was analyse MySQL’s peak use, which is a complex affair, but our tools enabled us to quickly discover the peak memory requirement and identify areas we could safely change.

There were multiple changes made to the MySQL configuration that brought the peak memory requirement down from 18GB to 6GB, which ensured that MySQL remained in RAM and didn’t get constantly swapped out to disk, which is very bad for performance.
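The right values depend entirely on your workload and available RAM, but the kind of my.cnf settings involved look like this (the numbers below are purely illustrative, not our client’s configuration):

```ini
[mysqld]
; Per-connection buffers: multiplied by max_connections,
; so small reductions here save a lot of memory at peak
max_connections  = 100
sort_buffer_size = 2M
read_buffer_size = 1M
join_buffer_size = 1M

; Global buffers: allocated once
key_buffer_size         = 256M
innodb_buffer_pool_size = 2G
```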

Apache & PHP

Apache was tweaked to cope with the regular demands by modifying the number of servers it keeps running and the memory they each consume. The changes themselves were small but they are multiplied by the number of servers that are created so the peak memory usage drops significantly.

PHP was quite an easy one to fix: a simple case of dropping the memory used per instance. There is one PHP instance per Apache server, so even small changes get multiplied to save lots of memory.

CPU Use

Having monitored the server for some time, there were peaks in the load that corresponded with some MySQL queries being run; we investigated further and identified some particularly slow, and CPU-intensive, queries by setting up and monitoring the slow_queries log.

MySQL

The CPU optimisation came from setting up the correct MySQL indexes to match the queries being run; in a number of cases this dropped queries that were taking above 40s (yes! 40s) down to 0s (instant). There was a single query taking anywhere between 85s and 240s to complete that was painful to watch, but thankfully this was only run from the Admin interface and was ultimately brought down to 0s with the correct indexes.
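The general technique: run `EXPLAIN` on a slow query to see whether it’s doing a full table scan, then add an index matching its WHERE/ORDER BY columns. A hypothetical example (table and column names invented for illustration):

```sql
-- Before: EXPLAIN shows a full table scan over orders
EXPLAIN SELECT * FROM orders WHERE customer_id = 42 ORDER BY created_at;

-- Add a composite index matching the query's filter and sort columns
CREATE INDEX idx_orders_customer_created ON orders (customer_id, created_at);

-- Re-run the EXPLAIN: the query should now use the new index
```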

XCache - PHP caching

To simplify things: PHP is a text script that is interpreted every time it is loaded, executed to generate the page and then discarded. This is repeated every time the page is requested, and it all takes time.

With this in mind, the way to improve response times is to use a PHP opcode cache, in our case XCache. An opcode cache stores the interpreted PHP scripts in a form that is very fast to execute, speeding up page generation. This in turn reduces CPU load and also improves page response times as the pages are served much more quickly.

We installed, configured and now monitor the XCache installation, and at the time of writing it is serving over 80% of pages from its cache.
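For reference, enabling XCache is a php.ini change along these lines; the extension path and cache sizes below are illustrative and should be tuned to your server:

```ini
[xcache]
; Path depends on where your package manager installed XCache
zend_extension = /usr/local/lib/php/xcache.so

xcache.size     = 64M   ; opcode cache size
xcache.count    = 2     ; typically one cache per CPU core
xcache.var_size = 16M   ; variable data cache
```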

Page Caching - Not sending content

Page Caching is the means by which your browser stores files locally so it doesn’t download them from the server every time it wants them.

The dilemma is: long expire times mean fewer hits on the site *but* this isn’t great if you’ve got frequent content updates as you want the new content to be sent to the visitor.

The improvement here was to correctly configure the expiration times for numerous different file types. E.g., dynamic webpages expire within 1hr, while product images expire after 1yr. Once an item expires, the browser will refetch the original.
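With Apache this kind of policy is handled by mod_expires; a sketch of the configuration described above might look like this (assuming mod_expires is enabled on your server):

```apache
ExpiresActive On

# Dynamic pages: expire quickly so content updates reach visitors
ExpiresByType text/html "access plus 1 hour"

# Product images: effectively static, safe to cache for a long time
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png  "access plus 1 year"
```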

Bad Robots - stopping leeches

There are lots of computers out there looking to steal your content, hog your bandwidth, crawl through your site and generate page requests that don’t bring you any business at all.

There is a ‘nice’ way to request they don’t crawl through your content, by using a file called robots.txt, but there’s no requirement that crawlers honour it and most malicious ones simply ignore it.

There is a way to block the robots from ever connecting with your site that’s quite simple to do, and we have configurations that block over 75 known bad computers while prioritising genuine visitors and good search engines like Google, Bing and Yahoo.
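The blocking itself can be done in the Apache configuration (or .htaccess) by matching known bad User-Agent strings; the bot names below are placeholders for whatever your own access logs reveal:

```apache
# Flag requests from known bad crawlers (example User-Agent patterns)
SetEnvIfNoCase User-Agent "BadBotExample"     bad_bot
SetEnvIfNoCase User-Agent "LeechToolExample"  bad_bot

<Directory "/var/www/html">
    # Apache 2.2 style: allow everyone except flagged requests
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>
```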

Summary

Overall, the changes highlighted above, along with some other smaller tweaks, enabled the server to cope with a higher volume of visitors and also improved the response times for each customer.

These are general principles for well configured servers that can be applied to pretty much any website without the need to change the website itself.

Contact us if you’d like us to improve your visitors’ experience, or if you think your existing server just isn’t up to scratch.

Move your Business to the Cloud

4 min read

Ring Alpha recently undertook the task of setting up and migrating a complete business over to the cloud. The task included migrating all of the User Accounts, Email, Documents, Calendars and Contacts to Google Applications for Business. If you’re thinking about improving your availability, cost savings and ensuring your documents are backed up and shared, then give Ring Alpha a call to see how we can help you migrate.

Google Applications represents significant benefits over hosting your own services and here are a few key reasons:

### Proven cost savings

Google’s web-based messaging and collaboration apps require no hardware or software and need minimal administration, creating tremendous time and cost savings for businesses. End users can use the familiar Microsoft Outlook interface for email, contacts and calendar as they transition to Gmail and Google Calendar. A leading research firm found that Google Apps is as little as 1/3 the total cost of competing solutions. Want to see how much you could save with Google Apps compared to Microsoft Exchange 2007?

### 50 times more storage than the industry average

Each employee gets 25 GB for email storage, so they can keep important messages and find them instantly with built-in Google search. Gmail is designed so employees can spend less time managing their inboxes, and more time being productive. Time-saving features like message threading, message labels, fast message search and powerful spam filtering help employees work efficiently with high volumes of email.

### Mobile email, calendar and IM access

With several options for accessing their information while on the go, employees can be productive with Google Apps even when they're not at their desks. At no extra charge, Google Apps supports over-the-air mobile access on BlackBerry devices, the iPhone, Windows Mobile, Android and many less powerful phones.

### 99.9% uptime reliability guarantee with synchronous replication

We guarantee that Google Apps will be available at least 99.9% of the time, so your employees are more productive and so you can worry less about system downtime.\*\* With synchronous replication, your data and activity in Gmail, Google Calendar, Google Docs and Google Sites is simultaneously preserved in multiple secure data centers. If one data center is unable to serve your requests, the system is designed to instantly fall back to another data center that can serve your account with no interruption in service.

The Radicati Group found that Microsoft Exchange typically has 60 minutes of unplanned downtime per month. Google Apps customers typically experience less than 15 minutes of downtime per month.

### Information security and compliance

When you trust your company's information to Google, you can be confident that your critical information is safe and secure. Google's information security team, including some of the world’s foremost experts in information, application and network security, are focused on keeping your information safe. Google and many other customers trust this system with highly sensitive corporate data. Businesses get these customizable security features with Google Apps:

- Custom spam and inbound mail filtering tools, powered by Postini, to complement powerful spam filters that automatically work with no up-front configuration.
- Custom outbound mail filtering tools to prevent sensitive information from being distributed, powered by Postini.
- Custom information sharing rules to determine how broadly employees are allowed to share with Google Docs, Google Calendar and Google Sites.
- Custom password length requirements and visual strength indicators to help employees pick secure passwords.
- Enforced SSL connections with Google Apps to ensure secure HTTPS access.
- Optional email archiving, up to 10 years of retention.

### Full administrative and data control

Administrators can deeply customize Google Apps to meet their technical, branding and business requirements. Integration options let you connect Google Apps to your existing IT infrastructure.

- Single sign-on API connects Google Apps to your existing authentication system.
- User provisioning utility and API connect Google Apps to your existing user directory system.
- Email routing and email gateway support let you run Google Apps alongside an existing email solution.
- Email migration utility and API let you bring mail from your existing email solution into Google Apps.

System branding and data ownership give Google Apps your look and feel, and ensure customer ownership of employee data.

- Custom user accounts on your company's internet domain.
- Custom logo and colors in the applications.
- Contractual customer ownership of employee data.

### Helpful 24/7 customer support

Google Apps is highly reliable and easy to operate, but support is available for administrators should you need it. Support options include:

- Phone support for critical issues
- Email support
- Self-service online support

Google Apps also has a deep network of partners ready to help businesses with deployment, data migration, user training, system integration and custom application development.

\*\* The 99.9% uptime SLA for Google Apps is offered to organizations using Google Apps for Business, as described in the Google Apps for Business Terms of Service.

Live Customer Support, chat and monitoring of your website visitors

2 min read

[![](/assets/125x125-go-chat-with.png)](http://www.olark.com/?r=tcg0p7ld "Olark live chat")

Live Customer Support and Realtime Information about your website visitors is absolutely critical to improving your business online.

Add Live Chat/Support to your website TODAY for ~~£100~~ **only £50**!    [Contact Us](/contact) to get started straight away.

UPDATE: Our clients with a small number of visitors are using this tool to chat to all of their customers while they’re in a buying mood.

Understand who’s on your site, make your customers love you, and earn more money.

  • Want to know who’s on your website RIGHT NOW?
  • Want to catch your customers before they abandon their basket?

Chat to your customers live while they’re on your website and help them find what they’re looking for with a great online support tool like Olark.

Integration is easy

We recently integrated this into our own website and were so impressed with the results that we began to encourage our existing clients to take this up.

We simply added the code snippet to our webpage and then connected it to our desktop client - Pidgin for Windows or Adium for Mac.

Once installed, your website visitors appear as chat clients right inside your desktop chat client, along with basic information about that particular user. On the website, they see that you’re ‘live’ and they can choose to chat with you if they need help; you can also choose to start chatting to them.

Realtime Chat

IMMEDIATE benefits

Since installing it, we’ve been able to catch multiple customers before they walked away and we’ve genuinely got business we wouldn’t have had, all because of the live customer support that Olark offers.

Our major eCommerce client installed it too and quickly caught orders and helped their customers while they were still shopping, those customers would have walked away without live customer support. Live Customer Chat paid for itself very quickly.

Olark is our recommended choice of real-time customer chat because it can be extended to capture more business information you can use to really drill down into those opportunities that may go away without live customer support.

[![](/assets/125x125-live-chat-blue.png)](http://www.olark.com/?r=tcg0p7ld "Olark live chat")

So far, we’ve enhanced this with:

  • known user names - find out who’s registered and who hasn’t
  • current basket value - focus on the customers who have a large amount of value in their basket
  • page priority - catch users who are on your Contact or Checkout pages

Comprehensive Features

The realtime chat and customer support software is great whether you’re on your own or if you have a team of support people.

The software also generates lots of useful information about your online chats and visitors, as well as pushing more data out to Google Analytics so you can really drill down into the detail.

Want Real-Time Chat on your website?

We can help you set up your website and desktop client to enable you to catch real-time visitors on your website and improve your chance of converting those visitors into customers. Contact us to find out how. I can honestly say that we are genuinely impressed with Olark Live Customer Support, and so are our customers, especially as it brought them real orders within hours of being installed.

Is at-desk internet access bad for productivity?

I know a few development companies that do not have internet access at the staff’s desk. Internet access is restricted to separate systems that are shared between multiple people and are in public areas so people can see who’s on and when.

One such company also has a ban on external calls at work, via the work system or your own personal mobile.

I know that these are justified as being for productivity reasons, staff have their distractions removed and live in a vacuum that consists only of known work and known processes.

Aside from security issues, is this really necessary? I appreciate that some people simply cannot resist procrastinating and checking up on their social networks every minute, but I’d also argue that at-desk internet access is as important as having electricity at your workstation. We know that it gives us access to an almost infinite array of knowledge, reference, learning and inspiration, as well as connections to people who can help us solve problems quickly.

Why limit the growth of everyone on your team because 1 person can’t resist Facebook? How and when do people find time to learn and bring new ideas to work?

I’ve run a studio and I’m aware of all of the implications on both sides of the story, but I believe there’s a compromise that’s easy and almost free to implement. Put a transparent proxy between the internal and external networks and tighten it up bit by bit, but only for those who can’t control themselves.

Generate a real-time report of who’s online, this week’s top downloaders and this week’s top page hits, and share it on your intranet. It works wonders, and you’ll see the balance restored between having the freedom of at-desk access to a sea of knowledge and the few who just cannot control themselves and need to be brought into line.