Our Data Analyst Level 4 apprenticeship develops people to collect, organise and study data to provide business insight. It prepares learners to work across a variety of projects, providing technical data solutions to a range of stakeholders and customers.
Our programme is unique, delivered through a leading-edge, innovative approach to challenge-based learning.
This Data Analyst apprenticeship programme is developed using the latest research into effective learning and the practical application of skills. It provides a flexible learning journey using online learning, practical exercises, video content, coaching and workshops to enable accelerated and proficient skills development, giving learners the opportunity to manage learning alongside their other priorities.
On successful completion of the programme the following qualifications are gained:
- Data Analyst Level 4
- BCS Level 4 Certificate in Data Analysis Tools
- BCS Level 4 Certificate in Data Analysis Concepts
- Dell EMC Data Science Associate
The entry requirements for this programme are as follows:
- Must have 2x A Levels (including maths or ICT)
- Or a Level 3 apprenticeship in a similar subject
- Or an International Baccalaureate at Level 3 in ICT
- Or a BTEC Extended Diploma (180 credits) in ICT
- And GCSE English and maths at grade C or above (or equivalent qualifications)
- And strong English skills
- Learners must not hold an existing qualification at the same or higher level as this apprenticeship in a similar subject
Job role suitability if already in employment:
To determine whether this programme is suitable for learners you must be able to answer “yes” to the following questions:
- Are they in a full-time Data Analyst role (or another similar role as listed in the overview section)? Data analysis must be their full-time role, not just part of their responsibilities.
- Will they be using multiple data analysis tools, not just limited to spreadsheets/Excel?
- Will they be using data visualisation tools to present data?
- Will they be collecting and compiling data from different sources – e.g. databases, spreadsheets and reports?
- Will they process, cleanse, analyse (including statistical analysis) and present data on a regular basis?
- Will they be running ad-hoc and standard data analysis reports and working on/assisting with performance dashboards in their role?
- Would data mining and forecasting be an integral part of their role?
- Will they be working on and processing large amounts of data on a regular basis?
- Are they comfortable looking at detailed and complex information?
- Do they have strong English and maths skills?
Note: Speak to your QA Account Manager for more advice on eligibility and job role/existing staff suitability for this programme.
We teach people in the way they tell us they want to learn. Life is busy. People need tech-enabled apprenticeship programmes that resonate with their day-to-day life.
We’ve invested in technology and digital content creation to deliver a ‘high-tech, high-touch’ approach to challenge-based learning for apprenticeships.
‘High tech’ refers to the innovative use of digital technology we use to facilitate our blend of learning methodologies. We deliver ‘mobile-first’ education – this means learning can be accessed anytime, anywhere and on any device, so apprentices get high-quality learning content on the go.
‘High touch’ refers to the many touch points at which learners interact with their learning and get support from expert QA training teams through workshops, coaching and online support.
This programme teaches skills in:
- Identifying, collecting and migrating data
- Interpreting data
- Statistical analysis and other analytical techniques such as data mining
- Producing performance dashboards
- Tools and techniques for data visualisation
- Presenting results to stakeholders and making recommendations
Empowering roles like:
- Data Analyst
- Data Manager
- Data Scientist
- Data Modeller
- Data Architect
- Data Engineer
During the programme, learners will be required to commit 20% of their working time to off-the-job learning.
The modules in our Data Analyst apprenticeship equip learners with the advanced technical skills they need for their role. Each module develops a core set of skills learners must perform well to be competent in their role.
Our programme modules follow our challenge-based learning approach: they combine multiple modes of learning (classroom, self-paced and practical), delivered seamlessly through a blend of online and classroom learning events.
Our approach is underpinned by the latest learning theory. Learner-centric approaches (where the focus is on how people learn, rather than which technology to use) have been shown to be more effective in promoting productive learning. The advent of new technology has fundamentally changed the way people create, change, share and interact with information, and the way they interact with each other. Methods are collaborative and hands-on – and so is our programme.
Crucial to our challenge-based approach are the remote module tasks, which require learners to work with their peers, tutors and mentors to learn and come up with a solution to a challenge. The cooperative and applied nature of this type of training makes it perfect for learners of all ages.
As part of their programme apprentices will complete:
- Eight knowledge modules teaching advanced theory and its practical application, through a combination of online learning and practical classroom workshops.
- A summative portfolio showcasing how the learner has demonstrated the skills they learn in real work projects.
- A synoptic project where the apprentice takes a business and technical brief and builds a finished product.
- An end-point assessment interview carried out by BCS – The Chartered Institute for IT – to assess whether learners have successfully met the learning requirements of the programme.
- Learners will need to complete the Dell EMC Data Science Associate vendor qualification as part of the gateway requirements. Learners will be taught the relevant material in Module 8, provided with additional resources and online learning support, and will attempt the exam at an external Pearson VUE exam centre.
The modules included in this programme are:
Module 1: Induction to Data
- Understanding what data is, and why it’s needed
- Introducing the Data Protection Act
- Understanding data – what, why, where, who, when
- Writing a basic Python program
- Setting expectations about tools and environment needed
- Using Visual Studio, Python, Excel and PowerBI
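The "basic Python program" topic in Module 1 could look something like this minimal sketch (the records and values below are illustrative, not course material):

```python
# A first Python program in the spirit of Module 1: read a few
# data records and report a simple summary.
sales = [
    {"region": "North", "units": 120},
    {"region": "South", "units": 95},
    {"region": "East", "units": 143},
]

# Sum the units across all regions and compute a simple average.
total = sum(row["units"] for row in sales)
print(f"Total units sold: {total}")
print(f"Average per region: {total / len(sales):.1f}")
```

Even a short program like this touches the module's themes: what the data is, why it is collected, and how a tool can summarise it.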
Module 2: Data Structure and Databases Using Python and SQL
- Installing SQL Server database and management studio
- Familiarisation with SQL commands
- Using Python to do simple statistical analysis
- Developing knowledge of data structures with/without databases
- Writing programs to process data structures using Python
- Using SQL to interrogate data tables
- Using Python to transform files
- Connecting Python to SQL
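The "connecting Python to SQL" and "using SQL to interrogate data tables" topics can be sketched with Python's standard-library sqlite3 module (the programme itself uses SQL Server; the table and data here are illustrative):

```python
# Sketch of connecting Python to a SQL database and interrogating a table.
import sqlite3

# An in-memory database keeps the example self-contained.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 19.99), (2, 5.50), (3, 12.75)],
)
conn.commit()

# Interrogate the table with an aggregate query.
cur.execute("SELECT COUNT(*), SUM(amount) FROM orders")
count, total = cur.fetchone()
print(f"{count} orders totalling {total:.2f}")
conn.close()
```

The same pattern - connect, execute SQL, fetch results into Python - carries over to SQL Server with a different driver.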
Module 3: Data Analysis and Compliance
- Understanding and defining the problem
- Developing knowledge of data cleansing and standardisation
- Using charts and visualisations in Excel and Microsoft BI
- Interpreting results
- Documenting and disseminating data
- Complying with the Data Protection Act
- Data cleansing practice using Python and SQL
- Presenting results
- Using SQL Server, Visio, Excel, and Microsoft BI
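The data cleansing and standardisation topics in Module 3 can be illustrated with a small plain-Python sketch (the raw records are invented for the example):

```python
# Sketch of basic data cleansing: drop missing values, standardise
# whitespace and case, and remove duplicates.
raw = ["  Alice ", "BOB", "", "carol", None, "Alice"]

cleaned = []
seen = set()
for value in raw:
    if not value or not value.strip():   # drop missing/blank entries
        continue
    name = value.strip().title()         # standardise whitespace and case
    if name not in seen:                 # remove duplicates
        seen.add(name)
        cleaned.append(name)

print(cleaned)  # ['Alice', 'Bob', 'Carol']
```

In the course itself these steps would typically be applied at scale with Python and SQL, but the logic - detect, standardise, de-duplicate - is the same.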
Module 4: Data Modelling and Database Design
- Understanding conceptual, logical and physical data modelling
- Carrying out logical to physical transformation using SQL server
- Understanding database types: hierarchical, network, object-oriented, dimensional and NoSQL
- Understanding the use of specific database types
- Using Visio to model data structures
- Creating databases from models
- Developing knowledge of data warehouse design and construction
- Using SQL to create dimensions from unnormalised data
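The "creating dimensions from unnormalised data" topic can be sketched as follows, again using standard-library sqlite3 in place of SQL Server (the flat sales data is illustrative):

```python
# Sketch of deriving a dimension table from unnormalised rows.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# An unnormalised "flat" table repeating the category on every row.
cur.execute("CREATE TABLE sales_flat (product TEXT, category TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales_flat VALUES (?, ?, ?)",
    [("Pen", "Stationery", 1.5), ("Pad", "Stationery", 3.0), ("Mug", "Kitchen", 4.0)],
)

# Extract the distinct categories into a dimension table with surrogate keys.
cur.execute("""
    CREATE TABLE dim_category (
        category_id INTEGER PRIMARY KEY AUTOINCREMENT,
        category TEXT UNIQUE
    )
""")
cur.execute(
    "INSERT INTO dim_category (category) "
    "SELECT DISTINCT category FROM sales_flat ORDER BY category"
)

cur.execute("SELECT category_id, category FROM dim_category ORDER BY category_id")
rows = cur.fetchall()
print(rows)
conn.close()
```

The fact table would then reference `category_id` instead of repeating the category text, which is the core move in dimensional warehouse design.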
Module 5: Data Architecture
- Developing knowledge of data architecture vs information architecture
- Understanding rules, policies, standards, and models
- Using metadata
- Developing knowledge of data architecture functions and management
- Understanding data transformation tools and their use in data architecture management
- Knowing the importance of quality standards in any data architecture
- Knowing the importance of maintenance to ensure quality in data architectures and data analysis
- Testing strategies to ensure quality in data architectures and data analysis
- Defining and documenting the data architecture components
- Defining and documenting the metadata using tools and by hand
- Using ETL techniques to create and support the architecture
- Undertaking practical exercises in maintenance
- Designing tests to ensure quality by determining data defects
Module 6: Requirements for Data Architecture and Analysis
- Understanding the need for clear, unambiguous requirements
- Developing knowledge of the classification of different types of requirements and the treatment of them
- Carrying out requirements elicitation including documentation, implicit and explicit requirements and expert knowledge
- Using models in answer to requirements
- Knowing adaptive vs predictive methodologies
- Determining requirements and documenting them, categorising business requirements, functional vs non-functional requirements, technical requirements etc.
- Describing and implementing change control procedures
- Dealing with changing circumstances and unclear requests
Module 7: Integration and Data Analysis Tools 1
- Training and testing the model
- Forming a hypothesis
- Using data analysis tools to perform statistical functions
- Defining mean, median, mode and range
- Using probability, bias and statistical significance
- Understanding linear and logistic regression
- Using scatter plots and understanding correlation
- Using factorials and probability
- Using stem and leaf plots
- Using box and whisker plots
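The descriptive statistics covered in Module 7 - mean, median, mode and range - can be demonstrated with Python's standard-library statistics module (the sample data is illustrative):

```python
# Sketch of the basic descriptive statistics from Module 7.
import statistics

data = [4, 8, 6, 5, 3, 8, 9, 4, 8]

print("mean  :", statistics.mean(data))      # arithmetic average
print("median:", statistics.median(data))    # middle value when sorted
print("mode  :", statistics.mode(data))      # most frequent value
print("range :", max(data) - min(data))      # spread of the data
```

Tools like Excel and Power BI expose the same functions; computing them in Python makes the definitions explicit before moving on to probability and regression.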
Module 8: Integration and Data Analysis Tools 2
- Using statistical analysis
- Interpreting requirements and producing the solution
- Using statistical languages – comparisons and applications
- Understanding online transactional processing (OLTP) vs online analytical processing (OLAP) vs Big Data
- Using data structures and integration to target different structures
- Understanding performant solutions
For more information, download the handout.
To contact us for more information, please fill in the form below.