Apprenticeship requirements


We’re working on ways to improve the apprentice experience by capturing and analysing the many kinds of data that can be collected through the apprenticeship journey.

This research is developing alongside our effective learning analytics project. At the core of the learning analytics service is the learning data hub (formerly called the learning records warehouse) where academic and engagement data is collected, stored and processed. We are planning to extend the learning data hub to enable data to be gathered from all aspects of the apprenticeship journey. In a previous blog post we listed some of the possible sources from which we can gather data.

By analysing the progress of apprentices we will be able to make timely and appropriate interventions, and enhance and improve the apprenticeship journey.


Apprenticeships are a growth area undergoing massive reform, with a government target of three million starts by 2020 and the implementation of the post-16 skills plan. This is a tenfold increase from the current level of 300,000 apprentices.

This increase means that for many employers (as well as providers), having timely and accurate information about their apprentices is critical to ensuring successful outcomes for those apprentices.


There is a range of requirements that employers will have: some are simply information about progress, while others will be based on the analysis of various data sets.

These requirements could include:

• Recruitment
• Induction
• Attendance
• Progress
• Topic coverage
• Optional modules choices
• Skills coverage
• Academic plan
• Timetabling
• Resources
• Assessment plan
• EPA Information
• Highlights report
• Risks
• Personal profiles
• Provider profiles
• Subject profiles

Providers may have similar requirements, and may wish to bring in data from employers and the apprentices themselves to combine with their internal data sources:

• Attendance
• Progress
• Topic coverage
• Assessment plan
• Library Usage
• Retention
• Achievement

We can separate the requirements into those that are derived directly from the data and those that depend on some form of processing and analysis.

We also need to consider the data requirements we're missing, which we hope to discover as we start to gather data and feedback.

Data sources


Within the Digital Apprenticeships project we have been reflecting on the data sources that we would need to extract data from in order to undertake relevant analytics and also display on a future provider or employer dashboard.

When it comes to apprenticeships, the kinds of data that can be gathered are similar to those for non-apprenticeship learners. However, some of the sources of this data may differ. We are also aware that in many of our member organisations different systems and processes are used with apprentices than with other learners.

The following are potential data sources from the provider:

  • Registration system
  • Attendance monitoring
  • Student Records (MIS or SIS)
  • VLE
  • CMS
  • Library Systems
  • Progress checker (eg Promonitor)
  • e-Portfolio
  • Assessment planner
  • Target settings
  • Web Analytics
  • Tutor reports
  • Quality reports
  • e-Book platforms
  • Video Server

Some of these may be in the same system, but what is important is understanding how to extract data from these systems in a format that can then be stored in the Learning Data Hub (the new name for the Learning Records Warehouse).

The following are potential data sources from the employer

  • HR System
  • Employer LMS (or VLE)
  • Employment register
  • Health & Safety

We know from talking to colleges that they are generally not allowed access to employer systems, which is understandable. However, what we need in this project is not access to the systems, but the extraction of data from those systems into the Learning Data Hub.

The following are potential data sources from the apprentice themselves:

  • Social media
  • Blogging
  • e-Portfolio
  • Student Activity
  • self-declared data

As with other systems, we are not looking to access these, but for the data from them to be gathered into the Learning Data Hub.

Some of this data will be static, or largely static, whilst other data sets will be constantly changing, updating on both a regular and an irregular basis.

Provided we can define requirements for how this data should be structured, it won't matter which systems the provider, employer or apprentice uses: the data can be extracted and added to the Learning Data Hub.
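As a way of picturing this, here is a minimal sketch of what such a common structure might look like. The Learning Data Hub's actual schema is not defined in this post, so the field names, the `HubRecord` type and the attendance-export columns below are all illustrative assumptions, not the real specification:

```python
# Illustrative sketch only: the Learning Data Hub's real schema is not
# specified here, so every field name below is an assumption.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class HubRecord:
    """A hypothetical minimal common format for hub records."""
    learner_id: str   # e.g. the learner's unique identifier
    source: str       # which system produced the record
    activity: str     # what happened (attended, submitted, logged_in...)
    timestamp: str    # ISO 8601, UTC

def from_attendance_csv_row(row: dict) -> HubRecord:
    """Map one row from a provider's attendance export (assumed columns)."""
    return HubRecord(
        learner_id=row["uln"],
        source="attendance_monitoring",
        activity="attended" if row["present"] == "Y" else "absent",
        timestamp=datetime.fromisoformat(row["date"])
                          .replace(tzinfo=timezone.utc).isoformat(),
    )

record = from_attendance_csv_row(
    {"uln": "1234567890", "present": "Y", "date": "2017-11-14"})
print(record.activity)  # attended
```

The point of the sketch is the mapping step: each source system keeps its own export format, and only a small adapter per system is needed to translate into the agreed structure.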


Once the data is in the hub we can start to analyse it, see what we can learn and help the apprentice on their journey.

Higher and degree apprenticeships project: think tank outcomes


We held a Think Tank event in Manchester on 14 November 2017 to inform the research for this project. The event was attended by 21 participants from 17 different organisations (mainly universities but with government and professional bodies also represented). Discussions were conducted under the Chatham House rule and the following is a summary of the outcomes.

Aims of the meeting

To consult with the community in order to inform Jisc guidance going forwards. To:

  • Discuss the changes to the landscape
  • Summarise initial findings from survey
  • Identify and prioritise the challenges to be addressed
  • Surface examples of policy, practice & resources that can usefully be shared
  • Identify and prioritise actionable insights from consultation to date

Background and context

Adrian Anderson, chief executive of the University Vocational Awards Council (UVAC) set the scene by discussing the background and policy context to the introduction of the new apprenticeship standards.

Link to Adrian’s slides UVAC JISC Slides

These are a few of the points made by Adrian:

  • The government definition of an apprenticeship specifies that it is a ‘job’ not a qualification. It is a job that involves a programme of training designed to enable the individual to gain the knowledge, skills and behaviours needed to become competent in a defined occupation. This is a very different approach to how we think about most of the education we deliver.
  • We need to be wary of talk about ‘exponential’ growth in the number of degree apprenticeships as that growth started from a very low base. Apprenticeships are also not providing a ready progression route into higher education as the number of level four and five apprenticeships is currently very low.
  • Many HE courses are currently offered due to student demand; there is concern within the sector that employer demand will be significantly different and will therefore require substantial review and changes.
  • We are faced with a situation where the employer is the customer and providers will be judged on employer satisfaction to a much greater extent than learner satisfaction. Similarly, HEIs will need to be marketing to employers and offering a different kind of package to the marketing they currently do to individual students.
  • Providers are having to adapt to a different statutory framework and in particular one that is founded on more of an inspection basis than the trust basis we have in HE.

Gill Ferrell talked about the outcomes of the project to date highlighting some of the potential implications for universities and colleges and some of the observations made in discussions with the sector.

Link to Gill’s slides Think Tank i1

What are the pain points?

The pain points as indicated by survey responses to date were highlighted and discussed.


Understanding the issues

Participants talked about issues generally and then voted on which of the topics they would like to explore further. These are some of the issues raised:

Learning, teaching and assessment design

  • Employer design groups are now in the driving seat and there is concern in the sector that providers have not always been sufficiently involved in the design of both standards and assessment.
  • It was suggested that the design of some degree apprenticeships is simply too long and too challenging for students who already have demanding jobs, e.g. a six-year course for solicitors. A good design academically would be likely to offer more flexibility/stepping-off points with interim qualifications, but this is not how the apprenticeship standards work.
  • Some institutions have tried to adapt what they already do rather than changing how they work and they are finding that fitting approaches to areas such as grading, marking and academic regulations to existing procedures is extremely difficult.
  • Designing learning that involves collaborative group work and peer review is more challenging in this context. In some subject areas there is so much concern about confidentiality that people are veering away from collaborative work and action learning sets altogether. It is the case that student peers may work for employers who are in direct competition with one another. It is also harder to design peer work when students are outwith the ‘safe’ environment of the classroom. N.B. some participants questioned whether students already in work actually have the same need to undertake peer work as other students. This was a minority view and most participants recognised peer work as particularly beneficial to apprentices even if it is not specifically required by the standard.
  • There can be issues around the academic ownership of student output.
  • There was considerable discussion about approaches to learning design i.e. whether to adapt existing offerings (brownfield site) or start again. Many people appeared to be trying to adapt existing offerings and finding this difficult but few participants felt they were really given the time to redesign from scratch.
  • There was criticism that some online offers are of poor quality (and are indeed being rejected by savvy employers).
  • It was suggested that some design of apprenticeship courses falls into the same trap as some online/distance learning, i.e. not realising how different the audience and their needs actually are.
  • It was noted that there is a lot more work involved in delivering apprenticeships because you are having to deliver a set of knowledge, skills and behaviours and liaise with employers on top of everything you would normally do for a degree.

End point assessment (EPA)

  • There is a serious lack of clarity around EPA at the moment (some went so far as to describe the situation as chaotic) and in many cases providers do not know who will do the EPA for students currently on programme.
  • There is a feeling that statutory bodies and policymakers are naive about providers’ ability to design learning geared towards an EPA that is not yet well defined.
  • Some HEIs reported that they are working with external EPA providers who keep ‘moving the goalposts’.
  • The limited choice of EPA centres is being compounded by the fact that some centres are already closing down.

Working with employers

  • There are a number of different roles involved in the relationship between employer, provider and learner (e.g. line managers, mentors, finance and legal teams at both ends) making it different to, and more complex than, the usual relationship between an institution and its students. There may also be other delivery partners and external EPAs.
  • Each of these stakeholders may have different expectations and there can be tensions in negotiating the formal contract between them e.g. employers may want to use contracts that don’t fit the funding rules or comply with HEFCE good practice. Employers take the view that they are procuring a service and they are in charge whilst HEIs are concerned that they are carrying the risk of delivery.
  • There can be a lack of clarity about exactly what the 20% off the job element of the apprenticeship actually consists of and tensions around releasing apprentices to undertake study. HEIs are worried about what will happen if they are audited and the evidence of 20% off the job learning is not deemed to be adequate.
  • There are particular issues for employers who work with multiple universities and find the approaches very different. Participants raised a particular query about how apprentices with the same employer but studying in different universities would perceive differences in assessment practice.
  • Apprentices with different employers can have very different support hence we are starting from a position where there is not a level playing field and equity in terms of support for learners undertaking the same degree.
  • Mentors in the workplace often have significant staff development needs, as the skills required for the mentoring role are not the same as those needed for their day job. There is a cost associated with this.
  • There are many issues around meeting employers’ information needs. Relationships are very different and universities talked about employers asking for information about apprentices that the university cannot ethically provide.
  • The need to work with employers throughout the whole period of the apprenticeship from initial recruitment through ongoing progress checking, monitoring and review was noted.
  • Course leads may not be particularly interested in employer needs as they see their educational role as much wider than this. Academic aims may differ from employer needs e.g. employers might not see the development of critical thinking skills as a priority if they cannot see why apprentices need the skills to be able to do the job.
  • Questions were raised about the impact on TEF and other metrics, such as NSS, if apprenticeships and other degrees are not viewed separately.

Data and systems issues

  • There are issues around meeting statutory and funding requirements that have grown out of FE and which HE sector corporate systems were not designed for. Even fundamental processes such as student enrolment are different.
  • Many institutions have muddled through with ad hoc solutions that were satisfactory when apprentice numbers were very low but these solutions will not scale up effectively.
  • There is a need for a shared online environment with employers.
  • There are data integration issues, compounded by a lack of institutional understanding about the whole degree apprenticeship data model and data flows. Our current systems are based on quite rigid models.
  • There is a proliferation of systems, technologies and processes within the institution leading to complexity and duplication of work.
  • Students are overburdened with multiple technologies e.g. the VLE e-portfolio, assessment tools, submission system etc.
  • The relationship to professional body accreditation requires integration with further systems in many cases.
  • Software licensing throws up many issues as there are tensions between academic and commercial use.
  • We don’t currently have systems for measuring employer satisfaction.
  • There are many concerns over data integrity.
  • Staff development needs are high and there are significant workload issues.
  • Budget structures are different to traditional programmes and finance systems struggle to cope.


Identifying actionable insights

We also identified many examples of good practice in addressing some of the issues raised above. These will be followed up with the institutions named and used to inform further guidance.

These are some of the suggestions for activities and outputs that could help the sector.

Learning, teaching and assessment design

  • Create structured tools for curriculum design that are specific to apprenticeships, addressing the role of the standard and of knowledge, skills and behaviours. Also guidance on how to approach curriculum design and how to review your designs. A work-based learning version of the Viewpoints cards was also suggested.
  • Create a mapping of the academic development cycle matched to the standards life cycle.
  • Guidance on EPA and how to interpret different elements of the EPA e.g. what types of assessment might fall under the heading ‘exam’?
  • Guidance on issues relating to data management and GDPR particularly: partnerships, online assessment, data theft, release of confidential data by students, what information you can supply back to employers.
  • Guidance on effective use of e-portfolios.
  • Feed examples of authentic assessment back to the wider community.

Working with employers

HEIs could use help in managing expectations to ensure successful delivery. There is also a critical lack of guidance about what an apprenticeship actually is and the roles and responsibilities involved.

  • Provide a glossary of terms to facilitate conversations (see the Jisc COGENT project).
  • It would be useful to have a map/infographic of the relationships covering topics such as: roles; tensions; relationships; information flows; responsibilities; accountability; communication needs; statutory requirements. Jisc could create a generic map onto which institutions could map their own relationships.
  • It would be useful to see an idealised communication approach based on what has been shown to work with employers showing the information flows, connections and loops and examples of the systems and tools that have been used to deliver this.
  • Provide an online service for managing the relationship with employers that includes facilities for digital signatures and is linked to the ULN and DAS.

Data and systems issues

HEIs could use help in developing a common understanding of technical infrastructure, information flows, and data management relating to degree apprenticeships.

  • Define the system requirements.
  • Develop a reference model for technology.
  • Provide guidance on integrated VLE and e-portfolio solutions.
  • Create an employer satisfaction toolkit.

Update on the review process of our new visual guide ‘The apprenticeship journey in a digital age: Provider toolkit’


Earlier this year we launched the beta version of our provider toolkit ‘Apprenticeship journey in a digital age’. The purpose of the toolkit is to show how effective use of digital technologies can help in the delivery of the new apprenticeship standards. It is aimed at colleges and training providers and has a ‘dip in’ format that is designed to accommodate the needs of a range of staff: senior managers may need to review the overview sections while other staff may drill down to specific content relevant to their practice.

Mindful of the pressure on the working lives of staff in the sector, this more visual format is a deliberate approach designed to make our guidance more easily accessible, breaking down the apprenticeship journey into four stages of preparation, planning, delivery and assessment. For each stage we offer a succinct overview of how digital technologies can support training providers and enhance the apprenticeship experience with access to case studies, further reports and guidance.

As part of the development process we have sought feedback from providers and stakeholders over the summer. Through a series of interviews and workshops we have engaged with over 100 people. Overall, the feedback has been very positive.

Review feedback – what users liked
Users really like the succinct and visual format and found it helpful to get a high level overview of each topic. The layout and ease of access of the resource was well-received and the topic headings reflected sector needs. The positive way in which digital opportunities are highlighted was helpful along with the surfacing of common issues and the easy access to relevant additional resources on specific topics.

Provider case studies were welcomed with the proviso that users really value the ‘warts and all’ approach to case studies – they want to see how challenges have been overcome and to know the pitfalls to avoid.

The graphical elements of the guide provoked a very ‘marmite’ response with differing views – many found it innovative and felt it helped to convey the overall journey but some did not understand the concept and didn’t find the iconography helpful – we are working on these aspects!

And what they didn’t like or would like more of
Understandably, there are concerns about currency as policy and practice in apprenticeship delivery evolves. To some extent, we are reliant on the sector being sufficiently confident and willing to share practice that they find to be effective as we develop the toolkit further.
Users requested more examples from curriculum areas that are regarded as ‘hard to reach’ but acknowledged that this is difficult without consensus as to which curriculum areas fall into this category.

Some felt that the resources linked to from the toolkit are still very text-based, sometimes rather ‘wordy’ and don’t reflect the variety of digital formats available. This is something to address as we gather future examples of best practice and build a broader range of digital assets.

Some smaller providers are seeking further guidance on developing an appropriate digital infrastructure. The breadth of practice in the sector and the different technological starting points means it is difficult to address the wide spectrum of issues providers face. The screen offering a ‘Technology view’ is perhaps the most challenging aspect of the resource and we are taking further guidance on this aspect.

Next steps and how you can get involved
We are working with a specialist web design company to develop the resource, to bring the concept to life and to address some of the issues raised. We hope to launch the next version early in 2018 but in the meantime, please do get in touch with Lisa Gray if you have an example of effective use of digital interventions relating to apprenticeship delivery to share.

Higher or degree apprenticeships
We are exploring how digital technologies can best support the delivery of the new apprenticeships at higher and degree level and held a ‘think tank’ event to explore this with providers on 14 November 2017. See our blog post on the higher and degree apprenticeships project.

In the meantime, if you are involved with, or considering, the delivery of higher or degree apprenticeships we would be grateful if you would complete our short survey to share your experiences with us and help us to better understand your issues and priorities.

Digital Apprenticeships mailing list


As the project moves through Alpha we will use the blog to update members and the community on progress.

We have also created a mailing list for people who are interested in the work we are undertaking, to find out more about the project, and how potentially to get involved in the alpha and beta phases.

The mailing list can also be the place to discuss issues that interested organisations are facing in embedding the use of technology to support apprentices, and the challenges of incorporating digital in the delivery of content and activities.

We will also use the mailing list to tell people about forthcoming community events, other Jisc events such as Digifest, and other relevant events and workshops.

You can sign up to the mailing list using this link.

Higher and degree apprenticeships project

Helping further and higher education providers deliver apprenticeship standards using digital technologies


What we’re doing

We are working with the sector to:

  • understand the requirements and where they present challenges
  • identify and disseminate existing good practice
  • find solutions to common issues

The progress so far section of this page will be updated with the outcomes of each stage of the work.

Why this matters

Apprenticeships are a growth area undergoing massive reform, with a government target of three million starts by 2020. The employer levy funding which began in April 2017 is estimated at £2.5 bn, and we are already seeing signs that employers are choosing to spend the funds available to them at the higher levels.

Increasing and more effective use of technology will be crucial to meeting the needs of this new employer led approach whilst maintaining high quality.

Employers want to see efficient and flexible delivery models, developing the required skills whilst minimising impact to their business. Student apprentices are used to accessing information when and where they want it and they too want flexible access to learning.

Despite increasing awareness of the potential of technology to support apprenticeship delivery, there are many practical obstacles.

Universities and colleges are finding that simply adjusting existing offerings is often insufficient to meet changing requirements and they are having to rethink how they design and deliver a very different type of learning experience.

How this will help you

Making the most of the opportunities afforded by the new apprenticeship standards requires a strategic approach. From course design and marketing, day-to-day learning and teaching practice, managing and sharing information and supporting students through to ongoing customer relationship management the approach is different and needs to be coordinated. We are working with staff in many different roles to address all of these aspects.

Who we are working with

We are working with the Quality Assurance Agency for Higher Education (QAA) and the University Vocational Awards Council (UVAC) as well as directly with apprenticeship providers.

Progress so far


We conducted a survey about your issues and priorities that closed in January 2018.

You can find the outcomes of the survey here.

Think Tank

We held a Think Tank event with a range of experts and practitioners in Manchester on 14th November 2017.

The discussions took place under the Chatham House rule but some outputs of the day are available here.


We have delivered a digital apprenticeships toolkit aimed particularly at providers of level 2-4 apprenticeships. Much of the good practice is equally relevant to higher and degree levels.

You can access the beta version of the toolkit here. A new version is due for launch January 2018.

Get in touch

If you want to find out more about this work please contact Lisa Gray, senior co-design manager

An employer’s perspective


The relationship dynamic between learner and provider is different if the learner is an apprentice.

On a traditional programme, the relationship is between the provider and the learner. A university and its students are a good example of that dynamic; an A-level learner and their local FE college are another. This dynamic is about how each side views their responsibilities towards the other.

When we come to apprentices the dynamic is different: the relationship is between the provider and the employer, and the employer has a different kind of relationship with the apprentice, that of an employer. The responsibilities in this case are different, and are recognised by providers who have been working with employers for many years.

We know from talking to colleges that many have established processes and procedures for communicating the progress of apprentices with the apprentices’ employers.

Employers have also spoken about how they communicate with providers about the progress of their apprentices.

Suppliers of proprietary software that records the progress of apprentices talk about the functionality that enables different views for apprentices, providers and employers. 

However there is a different perspective that is going to shift this dynamic and that is the proposed increase in the number (and type) of apprentices.

Apprenticeships are a growth area undergoing massive reform, with a government target of three million starts by 2020 and the implementation of the post-16 skills plan. This is a tenfold increase from the current level of 300,000 apprentices.

What this could mean for employers is that an employer that may have had five apprentices in one vocational area could suddenly find they have fifty apprentices across multiple vocational areas. Rather than working with a single provider, they may find themselves with multiple providers across different apprenticeships and different levels.

We will also see new employers who have traditionally not employed apprentices but who, with the introduction of the employer levy, will want to make use of this funding, probably for existing staff but also potentially for new apprentices. The employer levy funding, which began in April 2017, is estimated at £2.5bn, around a billion pounds more than current funding. Whichever route they go down, they will be engaging with providers on apprenticeship programmes.

This all means that existing systems and processes, though working fine today, may not be fit for purpose over the next three years and beyond. It is in this landscape that the Digital Apprenticeships project Jisc is working on will emerge to support providers in fulfilling the changing needs of employers.

Entering Alpha



Following a successful transition meeting last month the Digital Apprentice project has moved from Discovery to Alpha.

In the discovery phase we explore new ideas and emerging technology to establish which ideas meet Jisc members’ needs, are technically feasible, fit Jisc’s remit and stand a chance of becoming sustainable services. If an idea passes all these criteria, we move to the alpha phase.

The Digital Apprentice project has been looking at three-way communication between employer, provider and apprentices following the introduction of the employer levy for apprenticeships. As employers increasingly look to ensure they are maximising investment in their staff through the levy, they are likely to be contracting with a wider range of providers, whilst wanting to look across their entire cohort of apprentices to monitor activity and plan progression.

As part of the discovery phase we developed dynamic guidance for apprenticeship providers on embedding technology in design, delivery and assessment, which will be online in January 2018 (watch this space for more news).

So now we enter the alpha phase. Our focus is now on developing the data infrastructure which we will build on top of the learning records warehouse to facilitate multi provider dashboards for employers.

We will develop the tri-partite dashboards with a small number of organisations to iterate and see whether they offer real world benefits. If they do, we move onto the beta phase.


Digital Apprenticeships – next steps

Following our successful workshop with representatives from universities, colleges and training providers a couple of weeks ago, we have been looking at what comes next. The outcomes from the workshop showed us that we are broadly on track in terms of the kind of solution that we think we can build to help employers keep tabs on their apprentices and their providers, and the interactions between the two.

Our thinking has turned now to how to build some sort of dashboard that will present the information about apprentices that employers will need. Jisc has some useful experience in dashboards and working with data from multiple sources, and has its own consolidated data source – the Learning Records Warehouse, a key part of the Learning Analytics cluster activity.

We will investigate how we can build a framework to hold “widgets” that can pull information from different sources. One widget may be a feed showing attendance data for apprentices from the college or university, or outcomes of assessment activities, engagement (data from institution VLE, etc) as well as data drawn from the employers’ systems such as health and safety inductions and incidents, leave and other absences and end point assessment (EPA) information.
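To make the widget idea concrete, here is a minimal sketch of how such a framework might be shaped. This is not the project's design: the `Widget` class, the stub data sources and all the names below are hypothetical, standing in for real feeds from provider and employer systems:

```python
# Hypothetical sketch of a widget framework: each widget wraps one data
# source and knows how to fetch and summarise data for an apprentice.
from typing import Callable

class Widget:
    def __init__(self, title: str, fetch: Callable[[str], list]):
        self.title = title
        self.fetch = fetch  # given an apprentice id, returns data rows

    def render(self, apprentice_id: str) -> str:
        rows = self.fetch(apprentice_id)
        return f"{self.title}: {len(rows)} item(s)"

# Stub feeds standing in for real provider and employer systems.
def provider_attendance(apprentice_id: str) -> list:
    return [{"date": "2017-11-14", "present": True}]

def employer_hs_inductions(apprentice_id: str) -> list:
    return [{"induction": "site safety", "completed": True}]

dashboard = [
    Widget("Attendance (provider)", provider_attendance),
    Widget("H&S inductions (employer)", employer_hs_inductions),
]

for widget in dashboard:
    print(widget.render("1234567890"))
```

The appeal of this shape is that adding a new feed (assessment outcomes, VLE engagement, EPA information) means writing one new fetch function rather than changing the dashboard itself.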

Mock up of a possible dashboard


The next steps are to identify the actual widgets and to work through from the user stories that describe the requirements to the data sources needed to populate them, before moving on to user interface design and building the dashboard.

Workshop Report: What would truly digital apprenticeships look like?

As the discovery phase of the Digital Apprenticeship project continues, we wanted to ensure that the sector had an opportunity to have their say on the ideas we had developed, as well as to offer fresh perspectives to shape our work.


Nearly thirty people from across HE, FE and Skills attended our consultation workshop in London on 14 June.

Sue Attewell started the day off, providing an overview of where we were, background to the co-design challenge, the work undertaken so far, and the reasons we are heading down this road. She explained the process we use to decide our priorities and how we consult with the sector. Sue then went on to explain the progress we had made since the initial co-design challenge and where we were now. The delegates were then given an overview of the day and what we hoped, working together, we could achieve.

The first activity of the day, led and facilitated by Rob Bristow, involved ranking and reflecting on user stories. A user story is an informal description of a feature of a system, usually from the perspective of a user of that system.


In earlier workshops we had gathered user stories from the perspectives of the key players in the digital apprenticeship space (employers, providers and apprentices). We first asked our participants to rank a sample of these user stories that we saw as particularly germane to this activity, and then asked them to add more user stories, from the point of view of these same groups, where they saw gaps in what we had provided.

One area that came out strongly in this exercise as lacking in the supplied stories was the point of view of the finance functions of both employers and providers. There was a suggestion that an easy way of tracking spend on apprenticeships against the levy pot, along with other financial metrics, would be helpful. The point was also made that apprenticeships raise all kinds of issues around statutory reporting, and that this is likely to be a burden across the board. People also raised concerns that apprenticeships are a three-way contractual relationship with some thorny issues to be dealt with, not least the question of what happens when an employer goes out of business while its apprentices are still short of qualification. Not all apprenticeship standards have qualifications embedded in them, and there are questions hanging over the transferability of any progress against these standards when employers are forced to close.


Overall, the exercise reinforced our belief that there is both a need and an appetite for a means for employers and providers to share a holistic picture of their apprenticeship activities, and that there could be real benefits to the sector in pursuing this approach.


Over lunch we had many different conversations about apprenticeships and how digital can be used to enhance and improve that journey.

After lunch James Clay ran an activity on paper prototyping. Paper prototypes can provide an insight into the user experience that digital development or a text-based process might not. In human–computer interaction, paper prototyping is a widely used method in the user-centred design process, which helps developers create software that meets users’ expectations and needs – in this case, especially for designing and testing user interfaces. Using pen and paper keeps the focus on the user experience (UX) and the functionality of the tool. It gives the technical development team a clear steer from the project team on how to push forward the technical development, which data sets to use and how to process that data, as well as how it could potentially look to the end user on a dashboard. Combined with user stories, it supports and aids technical development.


Our delegates split into groups and, using their own ideas, came up with their own paper prototypes, which we will use to feed into our development plans and models. One group usefully focused on the language and principles that dashboards and apps should use, recognising that these can have both a positive and a negative impact on users.

We ended the day with an overview and feedback from the delegates.

We also mentioned the Apprenticeship and Technical Education Providers Digital Leaders programme running in September in Leicester.

The programme will support managers and senior staff to become digitally informed and empowered leaders, and they will learn how to help their organisations respond more effectively to technology-driven change. Our four-day digital leaders programme will equip you with the tools, knowledge and skills to:

  • Become a more effective digital leader through your own personal and professional development
  • Explore how organisations can engage more effectively with the digital technology at their disposal – at both strategic and operational levels
  • Discover and reflect on how digital technology is changing the way your organisation operates – creating new leadership challenges and strategic opportunities
  • Learn to lead, manage and influence digitally-driven change across organisations, departments, services and teams

Find out more