The apprenticeship toolkit is now live


The Apprenticeship toolkit shows how effective application of digital technologies can support the planning, design, delivery and assessment of apprenticeships.

The step-by-step toolkit shows the actions to cover at each stage, together with potential opportunities and pitfalls. It also clearly highlights specific examples where technology can be positively exploited.

Apprentices have slightly different needs to other learners, particularly because they are also employees spending most of their time learning on the job. It's important they are not left isolated but feel the benefits of supportive and inclusive processes in their learning environment.

Technology can play a huge role in improving motivation and keeping them connected to their teachers and peers.

This version of the toolkit is aimed at colleges and training providers (including employer-providers), and organisations delivering end point assessment (EPA) in England, but much of the content has relevance and applicability across nations.

The toolkit was developed in collaboration with providers and employers; we hope you enjoy using it and welcome any feedback.

Coming soon: Degree apprenticeships and Delivering apprenticeships in Wales. Please keep an eye on this space for further news.

 

Higher and degree apprenticeships survey outcomes

We recently ran a survey intended to support preparation of a forthcoming Jisc guide to the use of technology in delivering higher and degree apprenticeships. The survey was launched in October 2017 and closed in January 2018.

We received 49 responses from 37 different organisations. The breakdown of respondents by background was:

  • 42 higher education providers
  • 5 HE in FE providers
  • 2 other (government department & independent training provider)

Most of the respondents were in senior management positions relating to academic development or had specific responsibility for apprenticeships/vocational education, so we can be confident that the results paint a realistic picture of the state of play.

You can find a summary of the outcomes attached.

Survey report i1

CANCELLED – Digital Apprenticeship Community Event – 19th April 2018 – Portsmouth

CANCELLED

We are sorry to notify you that we’ve had a limited number of bookings for the Digital Apprenticeship Community Event at Portsmouth on Thursday 19 April and we have therefore decided to cancel the event.

We will be running a future community event later in the year, so please do look for any updates on our website.

Our apologies for any inconvenience caused.


Following the first successful community event in London, we are running a second community event at the University of Portsmouth on the 19th April 2018. The event runs from 10:30 to 3:30 and lunch will be provided.

If you’re working in the area of apprenticeships and are interested in how digital can improve the whole apprenticeship journey, we’d like to invite you to join this digital apprenticeships community event, the focus of which will be degree apprenticeships.

Our event will give you an opportunity to network, share practice and hear what Jisc – and various organisations – are doing in this space.

You’ll also have the opportunity to discover more about our digital apprenticeships project and emerging toolkits in this area.

Who should attend – Staff in colleges and universities working in the area of apprenticeships and those who are interested in how digital can improve the whole apprenticeship journey.

More information

Book now

Snow didn’t stop play


Last Wednesday, 28th February, some hardy souls managed a public-transport-challenged trip to London and gathered in Brettenham House for the very first Digital Apprenticeship Community Event. Whilst numbers were somewhat down on expectations, the quality of discussion and the quantity of refreshments were excellent.

Kicking off with a look at the background to the project and how we got this far, we then moved on to look at the prototype and completed the mandatory Jisc post-it activity facilitated by Rob Bristow.


Following that, Paul Bailey walked us through learning analytics and the onboarding process.

James Clay completed the morning session with a barriers and gaps activity using the bridge metaphor:

What is the solution to the problem that enables the desired outcome?

The problem: how do I cross the river without getting wet and continue my car journey?

The enabler: a bridge solves the problem by taking me over the river.

The outcome: I continue my car journey.

As the snow was getting fast and furious we had a short but exceedingly generous lunch before resuming with an excellent case study from Gilmar Queiros, Apprenticeship Development Manager at Staffordshire University. Gilmar very kindly shared the lessons learnt at Staffordshire in the hope of saving pain for others. His key takeaway: focus on what successful delivery looks like for you and then work back from there.

I finished with a tour of our soon-to-be-launched (at Digifest, March 2018) apprenticeship toolkit, which supports providers to embed technology in the planning, preparation, delivery and assessment of apprenticeships, with lots of handy tips, guides and case studies. Keep an eye on the blog for further news.

Thanks to all our attendees, who kept the questions flowing and contributed to a very vibrant day. I hope you all had safe journeys home, with minimal delays.

Our next event is in Portsmouth on 19th April 2018 – more details here.

Slides from the day

As a… When I am… I want to… so that I can…


User stories are an ideal vehicle for understanding the needs of users to inform product and service development.

A user story is an informal, natural language description of one or more features of a software system. User stories are often written from the perspective of an end user or user of a system.
User stories are a few sentences in simple language that outline the desired outcome. They don’t go into detailed requirements.


As we move along the innovation pipeline in the Jisc Digital Apprenticeships project, we are using user stories not only to support the development of the project, but also to avoid scope creep and to inform the project team about the different requirements of the different users in the product development process.

If we write user stories for the key players – the provider, the apprentice and the employer – we can ensure that the final product meets their actual needs and not the needs that we, the project team, think they have. There may be multiple users within those categories: the Apprentice Manager at a provider may have a very different user story compared to an instructor who is training the apprentice. Individual users may have multiple stories as well.

They provide the project team with focus and clarity.

We have a simple structure that allows us to have a shared understanding of user needs.

As a… When I am… I want to… so that I can…

For example:

  • As an apprentice.
  • When I am undertaking work based training.
  • I want to record what I have done.
  • So that I can share what I have learnt with my college.

We consult and talk to users to check and confirm these stories. Once we have the user stories in place, we can then check the product against them to ensure we are on track with the project.
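The four-part template above lends itself to a simple structured representation. The following is a minimal, hypothetical sketch (the class and field names are our own, not part of any Jisc tooling) of how the stories gathered in consultation could be recorded consistently:

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    """One story in the 'As a... When I am... I want to... so that I can...' form."""
    role: str     # As a/an...
    context: str  # When I am...
    need: str     # I want to...
    outcome: str  # so that I can...

    def __str__(self) -> str:
        return (f"As {self.role}, when I am {self.context}, "
                f"I want to {self.need} so that I can {self.outcome}.")

# The example story from the post, expressed with the template:
story = UserStory(
    role="an apprentice",
    context="undertaking work based training",
    need="record what I have done",
    outcome="share what I have learnt with my college",
)
print(story)
```

Keeping each clause as a separate field makes it easy to group stories by role (all apprentice stories, all employer stories) when checking the product against them.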

References
https://en.wikipedia.org/wiki/User_story
https://www.atlassian.com/agile/delivery-vehicles

Getting ready for Digital Apprentice

We are soon going to be starting to recruit institutions to be part of the pilot group for Digital Apprentice, so it seems timely to describe what will be needed to enable an institution to join in.

Digital Apprentice will be using the same key infrastructure as Jisc's existing Learning Analytics service, which uses provider data to build dashboards that teachers and others can use to monitor and get predictions of students' progress and likely attainment.

In the same way as Learning Analytics, Digital Apprentice works by bringing institutional data into Jisc's Learning Data Hub, so one of the first steps for an institution is to be in a position to sign Jisc's data sharing agreement. This legal agreement has recently been updated to be fully GDPR compliant and adapted to be explicitly about sharing data about apprentices.

Essentially the agreement (in amongst the legal stuff) sets out that Jisc and the provider will share the provider data (as defined) for the purposes of delivering the service. The agreement makes it clear that the provider will retain ownership of the data, but grants a licence to Jisc to use the data for the defined purposes. The provider remains the Data Controller, while Jisc is the Data Processor.


As the Data Processor, Jisc is obligated to process the provider data only in connection with the agreed purpose, in accordance with applicable law and in a secure manner, and to let the provider know of any data breaches. There is also a clause prohibiting disclosure to a third party (except appointed sub-contractors) and allowing the provider to audit compliance with these requirements.

Signing the data processing agreement is the crucial first step to taking part in the project, as Jisc cannot touch your data without it.

Once these legal formalities are out of the way the process of coming on board the project is:

  1. Extraction of data from institutional systems. This involves setting up an SFTP link and devising “recipes” to extract data from student record systems, the VLE, attendance monitoring, etc.
  2. Data validation and quality checks before the data enters the data warehouse (Jisc’s Learning Data Hub).

From there, the next steps will be to pull the data through to the Digital Apprentice dashboard and work with the institution to fine-tune it ready for the employers.
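The two onboarding steps can be sketched in miniature. This is a hypothetical illustration only – the field names and record shape are invented, and real extraction "recipes" are institution-specific – but it shows the extract-then-validate pattern before data reaches the hub:

```python
# Step 1: a "recipe" maps a source-system row onto a common record shape.
# Step 2: a quality check rejects incomplete records before loading.
# All field names here (STU_ID, CRS, TS, ...) are hypothetical examples.

REQUIRED_FIELDS = {"student_id", "course_code", "timestamp"}

def extract(raw_row: dict) -> dict:
    """Map a raw student-record-system row to the common record shape."""
    return {
        "student_id": raw_row.get("STU_ID"),
        "course_code": raw_row.get("CRS"),
        "timestamp": raw_row.get("TS"),
    }

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record may load."""
    return [f"missing {f}" for f in sorted(REQUIRED_FIELDS) if not record.get(f)]

row = {"STU_ID": "A1234", "CRS": "ENG-101", "TS": "2018-02-09T17:48:06"}
record = extract(row)
assert validate(record) == []  # a clean record passes through to the hub
```

Separating the recipe from the validation means a new institution only has to write the `extract` mapping for its own systems; the quality checks stay the same.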

The whole on-boarding process is well covered on the Learning Analytics Blog:

https://analytics.jiscinvolve.org/wp/on-boarding/

To find out more about Jisc’s Digital Apprentice activity, come to the Community Event on the 28th February at Jisc’s central London offices.

https://www.jisc.ac.uk/events/digital-apprenticeships-community-event-28-feb-2018

To keep in touch with our Digital Apprentice work follow this blog and join our mailing list:

https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=DIGITALAPPRENTICESHIPS&A=1

 

Digital Apprenticeship Community Event 28th February 2018


If you are working in the area of apprenticeships and are interested in how digital can improve the whole apprenticeship journey then we would like to invite you to attend the first of our community events.

The community of practice gives people an opportunity to network, share practice, hear what various institutions are doing and what Jisc is doing in this space.

The first of these events is taking place at the Jisc London offices on the 28th February 2018 from 10:30 to 3:30 and lunch will be provided.

You will have the opportunity to discover more about the Jisc project that is being undertaken about apprenticeships, as well as our new toolkits in this area.

Find out more about the event.

Book onto the event.

Apprenticeship requirements


We’re working on ways to improve the apprentice experience by capturing and analysing the many kinds of data that can be collected through the apprenticeship journey.

This research is developing alongside our effective learning analytics project. At the core of the learning analytics service is the learning data hub (formerly called the learning records warehouse), where academic and engagement data is collected, stored and processed. We are planning to extend the learning data hub to enable data to be gathered from all aspects of the apprenticeship journey. In a previous blog post we listed some of the possible sources that we can gather data from.

By analysing the progress of apprenticeships, we will be able to make timely and appropriate interventions and enhance and improve the apprenticeship journey.


Apprenticeships are a growth area undergoing massive reform, with a government target of three million starts by 2020 and the implementation of the post-16 skills plan. This is a tenfold increase on the current level of 300,000 apprentices.

This increase means that for many employers (as well as providers), having timely and accurate information about their apprentices is critical to ensuring a successful outcome for those apprentices.


Employers will have a range of requirements: some are simply information about progress, while others will be based on the analysis of various data sets.

These requirements could include:

• Recruitment
• Induction
• Attendance
• Progress
• Topic coverage
• Optional modules choices
• Skills coverage
• Academic plan
• Timetabling
• Resources
• Assessment plan
• EPA Information
• Highlights report
• Risks
• Personal profiles
• Provider profiles
• Subject profiles

Providers may have similar requirements, and may wish to bring in data from employers and the apprentices themselves to combine with their internal data sources:

• Attendance
• Progress
• Topic coverage
• Assessment plan
• Library Usage
• Retention
• Achievement

We can separate the requirements into those derived directly from the data and those dependent on some aspect of processing and analysis.

We also need to consider what data requirements we’re missing, which we hope to discover as we start to gather data and feedback.

Data sources


Within the Digital Apprenticeships project we have been reflecting on the data sources that we would need to extract data from in order to undertake relevant analytics and also display on a future provider or employer dashboard.

When it comes to apprenticeships, the kinds of data that can be gathered are similar to those for non-apprenticeship learners. However, some of the sources of this data may differ. We are also aware that in many of our member organisations, different systems and processes are used with apprentices than with other learners.

The following are potential data sources from the provider:

  • Registration system
  • Attendance monitoring
  • Student Records (MIS or SIS)
  • VLE
  • CMS
  • Library Systems
  • Progress checker (eg Promonitor)
  • e-Portfolio
  • Assessment planner
  • Target settings
  • Web Analytics
  • Tutor reports
  • Quality reports
  • e-Book platforms
  • Video Server

Some of these may be in the same system, but what is important is understanding how to extract data from these systems in a format that can then be stored in the Learning Data Hub (the new name for the Learning Records Warehouse).

The following are potential data sources from the employer:

  • HR System
  • Employer LMS (or VLE)
  • Employment register
  • Health & Safety

We know from talking to colleges that they are generally not allowed access to employer systems, which is understandable. However, what we need in this project is not access to the systems, but the extraction of data from those systems into the Learning Data Hub.

The following are potential data sources from the apprentice themselves:

  • Social media
  • Blogging
  • e-Portfolio
  • Student Activity
  • self-declared data

As with other systems, we are not looking to access these, but for the data from them to be gathered into the Learning Data Hub.

Some of this data will be static, or largely static, whilst other data sets will change and update both regularly and irregularly.

Provided we can define the requirements for how this data should be structured, it won’t matter which systems the provider, employer or apprentice uses: the data can be extracted and added to the Learning Data Hub.
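To illustrate the point, here is a small hypothetical sketch (source names and field mappings are invented for the example) of how a per-source mapping can turn records from any provider or employer system into one common structure before they are added to the hub:

```python
# Hypothetical per-source mappings: common field name -> that system's field.
# Adding a new system means adding one mapping, not changing the hub.
MAPPINGS = {
    "provider_vle": {"learner": "user_id", "event": "activity", "when": "logged_at"},
    "employer_hr": {"learner": "employee_ref", "event": "record_type", "when": "date"},
}

def normalise(source: str, row: dict) -> dict:
    """Convert a source-system row into the common hub structure."""
    mapping = MAPPINGS[source]
    return {common: row[source_field] for common, source_field in mapping.items()}

vle_row = {"user_id": "a42", "activity": "quiz_attempt", "logged_at": "2018-01-15"}
hr_row = {"employee_ref": "E-9", "record_type": "attendance", "date": "2018-01-15"}

assert normalise("provider_vle", vle_row)["learner"] == "a42"
assert normalise("employer_hr", hr_row)["event"] == "attendance"
```

Once every source speaks the same structure, the analytics downstream can treat VLE activity and employer attendance records uniformly.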


Once the data is in the hub, we can start to analyse it, see what we can learn, and help the apprentice on their journey.

Higher and degree apprenticeships project: think tank outcomes


We held a Think Tank event in Manchester on 14 November 2017 to inform the research for this project. The event was attended by 21 participants from 17 different organisations (mainly universities but with government and professional bodies also represented). Discussions were conducted under the Chatham House rule and the following is a summary of the outcomes.

Aims of the meeting

To consult with the community in order to inform Jisc guidance going forwards. To:

  • Discuss the changes to the landscape
  • Summarise initial findings from survey
  • Identify and prioritise the challenges to be addressed
  • Surface examples of policy, practice & resources that can usefully be shared
  • Identify and prioritise actionable insights from consultation to date

Background and context

Adrian Anderson, chief executive of the University Vocational Awards Council (UVAC), set the scene by discussing the background and policy context to the introduction of the new apprenticeship standards.

Link to Adrian’s slides UVAC JISC Slides

These are a few of the points made by Adrian:

  • The government definition of an apprenticeship specifies that it is a ‘job’ not a qualification. It is a job that involves a programme of training designed to enable the individual to gain the knowledge, skills and behaviours needed to become competent in a defined occupation. This is a very different approach to how we think about most of the education we deliver.
  • We need to be wary of talk about ‘exponential’ growth in the number of degree apprenticeships as that growth started from a very low base. Apprenticeships are also not providing a ready progression route into higher education as the number of level four and five apprenticeships is currently very low.
  • Many HE courses are currently offered due to student demand; there is concern within the sector that employer demand will be significantly different and therefore require substantial review and change.
  • We are faced with a situation where the employer is the customer and providers will be judged on employer satisfaction to a much greater extent than learner satisfaction. Similarly, HEIs will need to be marketing to employers and offering a different kind of package to the marketing they currently do to individual students.
  • Providers are having to adapt to a different statutory framework and in particular one that is founded on more of an inspection basis than the trust basis we have in HE.

Gill Ferrell talked about the outcomes of the project to date highlighting some of the potential implications for universities and colleges and some of the observations made in discussions with the sector.

Link to Gill’s slides Think Tank i1

What are the pain points?

The pain points as indicated by survey responses to date were highlighted and discussed.


Understanding the issues

Participants talked about issues generally and then voted on which of the topics they would like to explore further. These are some of the issues raised:

Learning, teaching and assessment design

  • Employer design groups are now in the driving seat and there is concern in the sector that providers have not always been sufficiently involved in the design of both standards and assessment.
  • It was suggested that the design of some degree apprenticeships is simply too long and too challenging for students who already have demanding jobs, e.g. a six-year course for solicitors. A good design academically would be likely to offer more flexibility/stepping-off points with interim qualifications, but this is not how the apprenticeship standards work.
  • Some institutions have tried to adapt what they already do rather than changing how they work and they are finding that fitting approaches to areas such as grading, marking and academic regulations to existing procedures is extremely difficult.
  • Designing learning that involves collaborative group work and peer review is more challenging in this context. In some subject areas there is so much concern about confidentiality that people are veering away from collaborative work and action learning sets altogether. It is the case that student peers may work for employers who are in direct competition with one another. It is also harder to design peer work when students are outwith the ‘safe’ environment of the classroom. N.B. some participants questioned whether students already in work actually have the same need to undertake peer work as other students. This was a minority view and most participants recognised peer work as particularly beneficial to apprentices even if it is not specifically required by the standard.
  • There can be issues around the academic ownership of student output.
  • There was considerable discussion about approaches to learning design i.e. whether to adapt existing offerings (brownfield site) or start again. Many people appeared to be trying to adapt existing offerings and finding this difficult but few participants felt they were really given the time to redesign from scratch.
  • There was criticism that some online offers are of poor quality (and are indeed being rejected by savvy employers).
  • It was suggested that some design of apprenticeship courses falls into the same trap as some online/distance learning, i.e. not realising how different the audience and their needs actually are.
  • It was noted that there is a lot more work involved in delivering apprenticeships because you are having to deliver a set of knowledge, skills and behaviours and liaise with employers on top of everything you would normally do for a degree.

End point assessment (EPA)

  • There is a serious lack of clarity around EPA at the moment (some went so far as to describe the situation as chaotic) and in many cases providers do not know who will do the EPA for students currently on programme.
  • There is a feeling that statutory bodies and policymakers are naive about providers’ ability to design learning geared towards an EPA that is not yet well defined.
  • Some HEIs reported that they are working with external EPA providers who keep ‘moving the goalposts’.
  • The limited choice of EPA centres is being compounded by the fact that some centres are already closing down.

Working with employers

  • There are a number of different roles involved in the relationship between employer, provider and learner (e.g. line managers, mentors, finance and legal teams at both ends) making it different to, and more complex than, the usual relationship between an institution and its students. There may also be other delivery partners and external EPAs.
  • Each of these stakeholders may have different expectations and there can be tensions in negotiating the formal contract between them e.g. employers may want to use contracts that don’t fit the funding rules or comply with HEFCE good practice. Employers take the view that they are procuring a service and they are in charge whilst HEIs are concerned that they are carrying the risk of delivery.
  • There can be a lack of clarity about exactly what the 20% off the job element of the apprenticeship actually consists of and tensions around releasing apprentices to undertake study. HEIs are worried about what will happen if they are audited and the evidence of 20% off the job learning is not deemed to be adequate.
  • There are particular issues for employers who work with multiple universities and find the approaches very different. Participants raised a particular query about how apprentices with the same employer but studying in different universities would perceive differences in assessment practice.
  • Apprentices with different employers can have very different support hence we are starting from a position where there is not a level playing field and equity in terms of support for learners undertaking the same degree.
  • Mentors in the workplace often have significant staff development needs, as the skills required for the mentoring role are not the same as those needed for their day job. There is a cost associated with this.
  • There are many issues around meeting employers’ information needs. Relationships are very different, and universities talked about employers asking for information about apprentices that the university cannot ethically provide.
  • The need to work with employers throughout the whole period of the apprenticeship from initial recruitment through ongoing progress checking, monitoring and review was noted.
  • Course leads may not be particularly interested in employer needs as they see their educational role as much wider than this. Academic aims may differ from employer needs e.g. employers might not see the development of critical thinking skills as a priority if they cannot see why apprentices need the skills to be able to do the job.
  • Questions were raised about the impact on TEF and other metrics, such as NSS, if apprenticeships and other degrees are not viewed separately.

Data and systems issues

  • There are issues around meeting statutory and funding requirements that have grown out of FE and which HE sector corporate systems were not designed for. Even fundamental processes such as student enrolment are different.
  • Many institutions have muddled through with ad hoc solutions that were satisfactory when apprentice numbers were very low but these solutions will not scale up effectively.
  • There is a need for a shared online environment with employers.
  • There are data integration issues, compounded by a lack of institutional understanding about the whole degree apprenticeship data model and data flows. Our current systems are based on quite rigid models.
  • There is a proliferation of systems, technologies and processes within the institution leading to complexity and duplication of work.
  • Students are overburdened with multiple technologies, e.g. the VLE, e-portfolio, assessment tools, submission system, etc.
  • The relationship to professional body accreditation requires integration with further systems in many cases.
  • Software licensing throws up many issues as there are tensions between academic and commercial use.
  • We don’t currently have systems for measuring employer satisfaction.
  • There are many concerns over data integrity.
  • Staff development needs are high and there are significant workload issues.
  • Budget structures are different to traditional programmes and finance systems struggle to cope.

 

Identifying actionable insights

We also identified many examples of good practice in addressing some of the issues raised above. These will be followed up with the institutions named and used to inform further guidance.

These are some of the suggestions for activities and outputs that could help the sector.

Learning, teaching and assessment design

  • Create structured tools for curriculum design that are specific to apprenticeships, addressing the role of the standard and knowledge, skills and behaviours. Also guidance on how to approach curriculum design and how to review your designs. A work-based learning version of the Viewpoints cards was also suggested.
  • Create a mapping of the academic development cycle matched to the standards life cycle.
  • Guidance on EPA and how to interpret different elements of the EPA e.g. what types of assessment might fall under the heading ‘exam’?
  • Guidance on issues relating to data management and GDPR particularly: partnerships, online assessment, data theft, release of confidential data by students, what information you can supply back to employers.
  • Guidance on effective use of e-portfolios.
  • Feed examples of authentic assessment back to the wider community.

Working with employers

HEIs could use help in managing expectations to ensure successful delivery. There is also a critical lack of shared understanding about what an apprenticeship actually is and the roles and responsibilities involved.

  • Provide a glossary of terms to facilitate conversations (see the Jisc COGENT project).
  • It would be useful to have a map/infographic of the relationships covering topics such as: roles; tensions; relationships; information flows; responsibilities; accountability; communication needs; statutory requirements. Jisc could create a generic map onto which institutions could map their own relationships.
  • It would be useful to see an idealised communication approach based on what has been shown to work with employers showing the information flows, connections and loops and examples of the systems and tools that have been used to deliver this.
  • Provide an online service for managing the relationship with employers that includes facilities for digital signatures and is linked to the ULN and DAS.

Data and systems issues

HEIs could use help in developing a common understanding of technical infrastructure, information flows, and data management relating to degree apprenticeships.

  • Define the system requirements.
  • Develop a reference model for technology.
  • Provide guidance on integrated VLE and e-portfolio solutions.
  • Create an employer satisfaction toolkit.