Retail – UK’s largest arts and crafts retailer
Salesforce Commerce Cloud, MuleSoft middleware integration, Dynamics NAV (ERP), Java, Serenity BDD automation framework
Managed Service, Quality and Test Strategy, Functional, Accessibility, System Integration, API testing, Compatibility, Automation, and Performance Testing
Hobbycraft was founded in 1995 in Bournemouth and has since grown into a nationwide business with over 100 stores across the UK, ready to support and inspire an ever-expanding variety of crafts including knitting, crochet, haberdashery, papercraft, baking, jewellery making and more.
Hobbycraft wanted to migrate to a new eCommerce platform and engaged Ten10 as a QA partner to manage the functional, integration, accessibility, performance and automation testing, and to work alongside their current development partners, including Astound Commerce and The Nav People (TNP). Hobbycraft had aggressive soft launch and go-live timescales to deliver the project within twelve months. Ten10 were onboarded four months after the initial project kickoff.
What we did
Ten10 successfully delivered a fully remote managed service to Hobbycraft over eight months, covering the areas of testing described below.
During the initial discovery phase of the engagement, Ten10 conducted a series of workshops to understand Hobbycraft’s current approach to quality assurance and the Salesforce Commerce Cloud platform delivery roadmap, context, priorities and risk profile.
The delivery manager and the functional and non-functional specialists defined and created the functional, automation and performance test approaches, including a transactional volume model (TVM), based on the business and solution context, technology stack, technical architecture and integrations, team structure, approach to quality risk mitigation, release strategy across all parties, and a defect management and reporting approach.
The test documents were shared with Hobbycraft and various third parties to review and approve the approach to testing. This was alongside some delivery and quality improvement recommendations to ensure Ten10 and Astound (the development partner) were aligned throughout the software delivery lifecycle.
Several checkpoints were scheduled across the engagement, the first at the end of the scoping phase, for Ten10 and Hobbycraft to review progress, validate the scope, approach, tools and test estimates, and confirm the team size was still appropriate for the level of work.
The next phase was a planning phase whereby the Ten10 test consultants, led by a test lead, created a suite of functional, integration, and accessibility tests within Hobbycraft’s instance of Jira. These tests were reviewed and approved by the Hobbycraft project team to ensure the correct level of coverage, detail and context.
All the user stories included within the scope of functional testing verified the functionality of Salesforce Commerce Cloud (SFCC) and Business Manager (BM) based on the agreed scope of devices as shown in the below table:
|Device|Screen Resolution, CSS Pixels|Pixel Density|
|---|---|---|
|Desktop Windows PC|1920 x 1080|MDPI|
|MacBook Pro 13|1280 x 800 / 1440 x 900| |
|iPad Air 2|1024 x 768|XHDPI|
|iPhone X|375 x 812|XHDPI|
|Samsung Galaxy S10|360 x 740|XHDPI|
Astound was delivering new features in three-week agile sprints, which were planned out before testing commenced so that Ten10 could ensure the right scripts were executed and a more efficient approach to testing was used.
Accessibility was a key factor within Hobbycraft’s requirements, and Ten10 ensured that the Web Content Accessibility Guidelines (WCAG) 2.1 were planned and implemented to Level A compliance, with some areas to Level AA. Ten10 used their internal reusable WCAG checklist during the planning phase to ensure all of Hobbycraft’s requirements were captured.
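To give a flavour of the kind of Level A check such a checklist covers, the sketch below flags images that carry no text alternative (WCAG 2.1 success criterion 1.1.1). It is an illustrative regex-based spot-check written for this write-up, not the tooling used on the project; a real audit would use a proper HTML parser and the full checklist.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative WCAG 2.1 Level A spot-check (success criterion 1.1.1, non-text
// content): list every <img> tag that has no alt attribute at all.
class AltTextCheck {
    private static final Pattern IMG = Pattern.compile("<img\\b[^>]*>", Pattern.CASE_INSENSITIVE);
    private static final Pattern ALT = Pattern.compile("\\balt\\s*=", Pattern.CASE_INSENSITIVE);

    public static List<String> imagesMissingAlt(String html) {
        List<String> offenders = new ArrayList<>();
        Matcher m = IMG.matcher(html);
        while (m.find()) {
            // An empty alt="" is valid for decorative images, so we only
            // flag tags where the attribute is absent entirely.
            if (!ALT.matcher(m.group()).find()) {
                offenders.add(m.group());
            }
        }
        return offenders;
    }
}
```

Checks like this are cheap to automate, which is why they typically sit alongside the manual checklist work rather than replacing it.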
A total of 800 functional, accessibility and integration tests were scripted across four weeks with a burndown chart created in readiness to highlight daily progress throughout the execution phase.
The performance preparation consisted of:
- Setting up the load test injection environment
- Creating a performance test pack at HTTP(s) level to simulate client web activity and web services for the agreed scope of Hobbycraft
- Creating test scripts to capture real user experience metrics such as time to first meaningful paint.
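By way of illustration of the second step, an HTTP-level journey script is usually paced so that a fixed pool of virtual users produces the target hourly transaction rate. The helper and figures below are hypothetical, not taken from the Hobbycraft TVM:

```java
// Illustrative pacing arithmetic for an HTTP-level journey script: with a
// fixed pool of virtual users, each user must start a new iteration every
// "pacing" seconds to hit the target hourly rate. Figures are hypothetical.
class Pacing {
    public static double pacingSeconds(int virtualUsers, double targetTransactionsPerHour) {
        // e.g. 100 users targeting 1,800 journeys/hour -> one iteration
        // per user every 200 seconds.
        return virtualUsers * 3600.0 / targetTransactionsPerHour;
    }
}
```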
Across the execution phase with Hobbycraft, a total of 17 Ten10 testers were involved, as follows:
|Activity|No. of Ten10 testers|
|---|---|
|Overall Delivery Management Oversight|One Principal Consultant|
|Functional Testing|One Functional Lead and five Test Analysts|
|Integration Testing|Oversight from one Lead, plus two Test Analysts|
|Accessibility Testing|Oversight from one Lead, plus two Test Analysts|
|Performance Testing|One Performance Test Lead|
|Automation Factory|One Automation Lead and two Junior Automation Testers|
The execution phase took five months to complete across multiple types of testing. The breakdown of tests by the end of the phase is as follows:
Throughout the execution phase, the Ten10 functional team identified the tests that could be automated and added them to an automated smoke test pack, so that the automation factory could execute the pack multiple times after each build release, in parallel with the ongoing functional testing.
As part of the Ten10 execution approach, the Test Lead organised sessions with the Hobbycraft business users at the end of each sprint of testing to demonstrate the newly delivered features. Issues identified in these sessions were discussed before being raised in Jira with Astound, ensuring the correct priority was assigned. This approach gave the business users early sight of, and involvement in, how the features functioned, and made the whole sign-off and approval process easier when going into production.
All defects identified during the engagement were raised in Jira and triaged with the Hobbycraft team. All P1/P2 issues were resolved before go-live, while P3/P4 issues were reviewed and accepted by Hobbycraft to be fixed post go-live.
The successful delivery of this project was heavily reliant on regular contact and communication between Ten10’s test team and Hobbycraft’s technical team and partners. To effectively manage remote working by the test team, a stand-up between the test team and Hobbycraft’s technical and third-party development teams was held each morning via Teams. The stand-up allowed both teams to discuss any project risks or blockers, review a daily burndown of the sessions completed, and monitor the defect closure rate against the plan. This ensured there was complete transparency on the project’s progress. It also enabled project priorities to be reviewed and adjusted frequently.
Test Progress Reporting
The burndown chart below was distributed daily in a report to show progress versus plan, based on executed tests and the retesting effort required once defects had been resolved by the various third parties.
In a typical test project, the gap between the total number of defects found and the number of defects closed would be expected to shrink over time. However, in the early reporting provided to Hobbycraft it was apparent that as defects were continuing to be identified, the defect closure rate was not gaining on the total number of defects identified.
The table below highlights the total number of defects raised across the sprints, with only minor issues left open to add to the backlog for future releases.
Hobbycraft was provided with full access to all defect details, so they were able to review the findings and, where relevant, change the priority assigned based on their in-depth understanding of the planned website usage and the business’ attitude to risk.
To prepare for the performance testing phase, Ten10 worked closely with the Hobbycraft data analytics team to understand the load on the current website, along with the forecast of Hobbycraft orders per hour. This allowed Ten10 to formulate a series of performance test scenarios that would exercise the key journeys within the site, under both the load Hobbycraft was currently experiencing and the load anticipated within the next five years.
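One common way to turn an orders-per-hour forecast into a load profile is Little's Law (concurrency = arrival rate × journey duration). The sketch below is illustrative arithmetic with made-up figures, not Hobbycraft's actual model:

```java
// Hypothetical sizing arithmetic: convert a forecast of orders per hour into
// the number of concurrent virtual users a load test needs, via Little's Law.
class LoadModel {
    public static int virtualUsers(double ordersPerHour, double avgJourneySeconds) {
        double arrivalsPerSecond = ordersPerHour / 3600.0;
        // Round up so the test never under-drives the target rate.
        return (int) Math.ceil(arrivalsPerSecond * avgJourneySeconds);
    }
}
```

For example, a forecast of 1,800 orders per hour with an average five-minute checkout journey implies roughly 150 concurrent virtual users for that scenario.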
After formulating a subset of journeys based on the data given, Ten10 presented the proposed scenarios and load profiles to Hobbycraft. Once agreed, Ten10 commenced scripting the scenarios and started preparing for the execution.
Once the performance test scripts were ready, the scripts were executed within the pre-production environment. Due to environmental build issues, this execution phase was delayed. However, the performance test team was able to script and do the bulk of the work before the environment switch to ensure that they remained on schedule for the launch.
As the performance tests were executed, all the data was sent to InfluxDB, then visualised in Grafana to help us understand the impact of the load. We were able to identify that the first layer of API calls was performant as would be expected from Salesforce but these calls made a subsequent async downstream call to the Hobbycraft on-prem environment which had a noticeable impact.
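For illustration, a load-test listener feeding InfluxDB typically emits one line-protocol record per sampled transaction, which Grafana then aggregates. The measurement and tag names below are invented for this example rather than the project's actual schema:

```java
// Sketch of an InfluxDB line-protocol record for one sampled transaction:
// measurement, comma-separated tags, a space, field(s), a space, timestamp.
// Names here are illustrative, not the project's actual schema.
class InfluxLine {
    public static String point(String transaction, long elapsedMs, boolean success, long epochNanos) {
        return "response_times,transaction=" + transaction
                + ",success=" + success
                + " elapsed_ms=" + elapsedMs + "i"   // "i" marks an integer field
                + " " + epochNanos;
    }
}
```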
Fortunately for Hobbycraft, the SQL server was able to recover from the load during the performance test, and Hobbycraft was made aware of the potential issue. The performance testing phase was successful and gave Hobbycraft confidence that their website would remain performant when handling traffic up to their five-year target.
The functional test team created a backlog of manual tests, which allowed the automation factory team to start automating straight away. These tests were prioritised based on execution frequency, their reliance on data, coverage of high-traffic areas, and analysis provided by Hobbycraft.
Over 11 weeks, the automation team created a smoke test pack containing 56 automated tests. These consisted of:
- 50 UI tests which covered high traffic functionality of StoreFront
- 6 API tests to validate the functionality of the Hobbycraft APIs
The project was initially scoped out by the Automation Lead, who created the automation framework using Serenity BDD, allowing the team to write cleaner, more maintainable automated acceptance and smoke tests more efficiently.
Once the framework was created, the remaining automation testers were onboarded to script the automated tests supported by the functional test team. Once tests were scripted, they were reviewed by the Automation Lead before the code was merged and the test then became part of either the smoke or regression pack.
Both packs were executed regularly throughout the week or when deployments were made by the third-party development team. Automation demos were hosted throughout the engagement to demonstrate to key project stakeholders the scope of testing, future plans for automation and address any concerns.
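The split between the smoke and regression packs can be sketched as a simple tag-based selection, in the spirit of the tags Serenity BDD and JUnit support; the test names and tags below are invented for the example:

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative tag-based pack selection: a test joins the smoke or regression
// pack according to the tags it carries. Names and tags here are invented.
class PackSelector {
    public record TestCase(String name, List<String> tags) {}

    public static List<String> pack(List<TestCase> all, String tag) {
        return all.stream()
                .filter(t -> t.tags().contains(tag))
                .map(TestCase::name)
                .collect(Collectors.toList());
    }
}
```

Keeping pack membership as metadata on each test, rather than as separate suites, is what lets the same codebase serve both the per-deployment smoke runs and the fuller regression runs.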
An example of an automated report that was produced from each run can be seen below:
- Aggressive go-live timescales, with multiple types of testing scheduled alongside environmental constraints. Ten10 mitigated this risk by aligning with the third-party development teams’ release plans and ensuring efficient testing was delivered on time.
- A requirement for dynamic planning to flex around the delivery of builds, where the content of a release could change in a relatively fluid manner, meaning the Ten10 team often needed to split and change focus to ensure productive QA against the available builds.
- Environment constraints, including functional and performance factors.
- A remaining challenge is for Hobbycraft to continue building and maintaining the automation pack so its value does not decrease over time. Hobbycraft could ringfence golden test data to reduce the failure rate of automated tests, and use Jenkins as their integration tool to execute the automated tests whenever new code is added to the code base.
- A smooth upskilling on SFCC across the Ten10 team meant we were quickly established as product SMEs.
- The project was delivered successfully to budget and within the timescales with few defects remaining at go live.
- Confidence from programme and external stakeholders retained at all stages.
- Ten10’s Automation Factory provided a high-value, cost-effective and flexible solution for Hobbycraft to address their automation backlog and ongoing requirements.
- Greater automated test coverage provided confidence in the new features developed for the website, and will reduce the amount of manual retesting effort required.
- Ten10’s approach to user acceptance signoff not only allowed Hobbycraft business users early sight and involvement within the development but gave them confidence in the features. This approach saved time at the end of a project, avoiding much of the usual rush to complete user acceptance testing before going live.