
General Certificate of Secondary Education ICT
Practical Problem Solving in ICT (45202)
Report on the Examination, June 2016
Version: 1.0

Further copies of this Report are available from aqa.org.uk Copyright 2016 AQA and its licensors. All rights reserved. AQA retains the copyright on all its publications. However, registered schools/colleges for AQA are permitted to copy material from this booklet for their own internal use, with the following important exception: AQA cannot give permission to schools/colleges to photocopy any material that is acknowledged to a third party even for internal use within the centre.

REPORT ON THE EXAMINATION General Certificate of Secondary Education ICT 45201 June 2016

Unit 2: The Assignment: Applying ICT

The 2016 Unit 2 assignment was based on helping to organise a school gardening club. The work presented showed high standards of attainment from many centres, with candidates indicating their understanding of the requirements of each task, both in terms of the task itself and the evidence needed to produce a documented solution to it.

The Candidate Booklet initially outlined two tasks that candidates were to undertake:
- to set up a system to record the sales of plants and produce
- to set up an interactive presentation about the club.

There was also a model to produce (assessed only in Implementation: Evidence of solution).

Summary

There was a slight decrease in the number of centres whose candidates were out of tolerance this year. However, some candidates, from the same centres as last year, were again out of tolerance. Each centre is encouraged to consult its annually appointed controlled assessment adviser whenever it has any questions.

All centres which used and submitted the Unit 2 Mark Grid enabled the moderator to comment on any differences between the AQA standard and the centre's own marking. A number of candidates' Unit 2 folders were seen which did not have any teacher annotation to support the marks awarded, as required by the JCQ rules. This continues to be surprising, as annotation is where teachers can indicate where they found the evidence to support their marks.

Some centres did not use the spreadsheet version of the mark grid and, as a result, frequently under-marked their candidates: the spreadsheet rounds up the marks and the centres did not.

In design and implementation, candidates from some centres did not produce all the items required, especially for Task 1, where many minor parts were missed.
It is sensible to encourage candidates to use a checklist at the end of both sections to ensure all parts of the task have been tackled.

In testing, plans from some students lacked the precision to be fully usable (ie the test data was not specific and the expected results were not always shown in the correct order). The effect of not having checkable expected results was that these candidates could not confirm whether their tests worked or not. A significant minority of candidates included a vast number of tests which were not requested in the Candidate Booklet and so could gain no marks.

The report should be formal (for example, it should include the name of the recipient, the sender, the date created and its purpose). Candidates should begin the report with issues raised by the thoughts of Mr Rutherford; three issues are sufficient for this purpose. After explaining them, candidates should then proceed to make recommendations about how they may be solved. A small number of candidates, usually from the same centres, misinterpreted the report as being about what mistakes they had made or how the task was completed.

The Evaluation of others' use of ICT was not always tackled in the correct way. Although there were suitable attempts to comment on the work of another student, candidates didn't follow this through to show how it could help them to improve if undertaking a similar task in future. This section is not about how the other student could improve their solution.

When assessing centre work, AQA continues to recommend that centres use the materials in the Teachers' Online Standardising area to familiarise themselves with the standard.
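One recurring arithmetical issue is the rounding of averaged marks: the electronic mark grid rounds up, while centres marking on paper often truncated and so under-marked. A minimal sketch of the difference (illustrative only; the mark values are invented):

```python
import math

def average_mark(task1: int, task2: int) -> int:
    """Average the totals for both tasks, rounding UP as the
    electronic mark grid does (per the Teachers' Notes)."""
    return math.ceil((task1 + task2) / 2)

# A centre truncating instead of rounding up under-marks by one
# whenever the two task totals have an odd sum.
print(average_mark(7, 8))   # electronic grid gives 8
print((7 + 8) // 2)         # truncating by hand gives 7
```

This is why the report recommends the spreadsheet version of the mark grid: the rounding is applied consistently rather than being left to each assessor.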

Analysis

As required in the specification, the analysis work for both tasks should be completed and presented as a discrete section at the beginning of the candidates' work for Unit 2. This was appropriately carried out by nearly all candidates. It was clear from the evidence presented that, in the main, candidates had been well prepared for this component of controlled assessment. In addition, most centres annotated the work by showing ticks against correctly identified criteria or by including completed Standard Analysis Mark Grids. Where either of these was included it was usually straightforward to support the centre marking.

Following the completion of this work, centres must then provide each candidate with the Standard Analysis to use for each task they are to undertake. The evidence suggested that this had usually been done correctly.

Design

There are two elements to design:
- planning
- explaining the choices made.

The work presented showed varying levels of addressing the two elements required for each task: planning how the task is to be solved, using either hand-drawn or computer-drawn plans, and explaining why the choices were made.

Many candidates find it easier to list information for the sketches on separate pieces of paper, for example writing 'Image 1' on the sketch and then detailing the image (name, location, file type, size, effects etc) separately. This is to be encouraged, as candidates without very small handwriting often struggle to fit all the detail on the sketch itself. Additionally, a copy of the testing plan should be included in this section, which will eventually be credited in the testing section. This was rarely seen.
In general, candidates designed plans for the requirements that were to be implemented, namely:

Task 1, part 1: create a system to store information provided about plants and produce; use drop-down lists to simplify data entry; create an efficient way to enter new items (switchboard or similar, plus a data entry form).

Task 1, part 2: create a list of items that are not selling well; use a parameter query, or similar, to allow any specified number of plants to be used; show only the relevant fields; sort in order of sales (fewest at the top); show that it is the school's gardening club (by means of a logo or name); make sure the list is eye-catching; include a picture of something the club grows.

Task 1, part 3: add a calculated field to show the profit for each item.

Task 1, part 4: produce a list of those items making more than £20 profit; sort the list so the most profitable is at the top; select the relevant fields to use; show that it is the school's gardening club (by means of a logo or name); make the list show the charity being supported and ensure it is well presented; add a button to the switchboard (or similar) to allow easy access to the list.

Task 1, part 5: change the price of fuchsias to £5 and reprint the profit list.

Task 2: create an interactive presentation for the club showing:
- Page 1: four facts from the Gardening club facts file

- Page 2: link to the most-profit list from Task 1
- Page 3: information about the supported charity
- Page 4: information about one item grown by the club.

To achieve a mark in the higher ranges, a candidate should provide sufficient detail for a third party to carry out the implementation from the plan. There were a number of cases where the mark awarded did not reflect the work presented.

For assessment purposes, the planning and design choices should be marked separately and then added together for each task. The electronic mark grid will average the totals for both tasks and record the result in the summary section of the grid. Centres are encouraged to use the electronic mark grid as it dramatically reduces calculation errors.

Plans were submitted as indicated above. When using computer-drawn formats it must be clear to the moderator that the plan does not include evidence of implemented work (which would be credited there and not in design). Many of the computer-drawn plans included work that was clearly taken from the Implementation. There is no need for candidates to recreate every font style, fill pattern etc in the design; it is not an art project. It is acceptable to state what the choices are: noting that the fill pattern will be, say, dark blue fading horizontally to light blue is acceptable, rather than either drawing this pattern or setting it up on the computer.

In a large number of out-of-tolerance centres the plans were not appropriately marked, the most common reason being that they would not be implementable by a third party due to missing information.

Task 1: candidates should include planned evidence of the structure of the database to be created. There are many ways of tackling each part and no one approach is better than the others, so long as all of the performance criteria are met in full. It is acceptable for candidates to print off screenshots of QBE grids to complete by hand rather than spending time drawing lots of boxes.
Each section listed below would need a separate sketch to allow all the information to be seen:

Gardening club system
- import the file given, showing how this will be done
- use drop-down lists to simplify data entry
- check suitable field types are used
- create a switchboard or similar
- create a data entry form

List of items that are not selling well
- create a parameter query
- create a report based on this query
- select relevant fields (either at query or report stage)
- sort in order of sales, fewest at the top (either at query or report stage)
- include the school gardening club's logo or name
- make the list eye-catching
- include a picture of something the club grows

Add a calculated field to show the profit for each item

List of those items making more than £20 profit
- create a query (either hard-wired or parameter)
- create a report based on this query
- sort the list so the most profitable is at the top
- select relevant fields (either at query or report stage)
- include the school gardening club's logo or name
- make the list show the charity being supported and ensure it is well presented
- add a button to the switchboard (or similar) to allow easy access to the list

It is expected that these plans would be judged separately and averaged to produce an overall planning mark. Where a significant number of plans were not present, the marks for the others must still be averaged over the full number of plans expected. For example, if there should be 4 plans, each with a possible mark of 12, and only 1 plan is submitted, then the mark obtained by that 1 plan (out of 12) would need to be divided by 4 to get the overall planning mark.

Task 2: candidates should show the creation of the interactive presentation, including all the items listed above. The plans for Task 2 were, on the whole, much better than those for Task 1, and it was more usual to see accurate marking of these.

Design choices: many candidates used the desired outcomes and performance criteria to assist them. This is to be encouraged. Their explanation of why these will meet the user's needs is an important aspect of making choices. This should explain why the candidate has chosen a specific way of presenting, say, something on the plan. It does not need to be in a separate section, but many candidates find it easier to complete if it is separate.
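The averaging rule in the worked example above can be expressed as a short sketch (illustrative only; the 8-out-of-12 mark is invented):

```python
def planning_mark(plan_marks: list[int], plans_expected: int) -> float:
    """Average the submitted plan marks over the number of plans that
    SHOULD have been produced, so a missing plan contributes zero."""
    return sum(plan_marks) / plans_expected

# One plan submitted, scoring 8 out of 12, where 4 plans were expected:
print(planning_mark([8], plans_expected=4))            # 2.0, not 8.0

# All four plans submitted at full marks gives the maximum of 12:
print(planning_mark([12, 12, 12, 12], plans_expected=4))   # 12.0
```

Dividing by the number of plans expected, rather than the number submitted, is what prevents a candidate who produced only one plan from scoring as highly as one who produced them all.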
The points below arose from some work submitted:
- high marks were being awarded without the necessary explanation of the choices made
- whilst some limited credit can be awarded for candidates explaining their own design choices, awards in the higher mark ranges must relate to the correct criteria
- where candidates only give a reason for their own choice, up to 4 marks may be awarded
- a very simple choice may take the form of 'because I have been told to do this by Mr Rutherford'; this is not sufficient for marks of 7 plus without further explanation
- not all the choices can be explained, and this should be taken into account when allocating marks.

Implementation

There are three elements to implementation:
- show skills, understanding and efficiency in building the solution for both tasks
- show evidence of the solution meeting the criteria set for both tasks, and of the model of running the gardening club showing the profit or loss for the club
- annotate how the solution was built or what the solution shows for each task.

Candidates were required to implement the same requirements listed in the Design section above (Task 1, parts 1 to 5, and the four pages of the Task 2 presentation).

Work presented for Skills, understanding and efficiency was variable but, in the case of higher-ability candidates, often of an appropriate standard.
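Two database techniques recur throughout Task 1: a parameter query (a value supplied at run time) and a calculated profit field. The sketch below shows an illustrative SQL equivalent; candidates typically worked in desktop database software such as Access, and the table and field names here are assumptions, not taken from the Candidate Booklet.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE stock (
    item TEXT, cost REAL, price REAL, sold INTEGER)""")
conn.executemany("INSERT INTO stock VALUES (?, ?, ?, ?)", [
    ("Fuchsia", 2.00, 4.50, 2),
    ("Tomato plant", 0.50, 1.50, 40),
    ("Sunflower", 0.20, 1.00, 1),
])

# Parameter query: list items selling fewer than a number supplied at
# run time, fewest sold at the top (cf Task 1, part 2). The ? plays
# the role of the value typed into the parameter query dialog box.
threshold = 3
slow = conn.execute(
    "SELECT item, sold FROM stock WHERE sold < ? ORDER BY sold",
    (threshold,)).fetchall()
print(slow)   # [('Sunflower', 1), ('Fuchsia', 2)]

# Calculated field: profit per item (cf Task 1, part 3), computed by
# the query rather than stored in the table.
profit = conn.execute(
    "SELECT item, (price - cost) * sold AS profit FROM stock").fetchall()
```

The same two ideas carry over directly to a QBE grid: the parameter appears as a prompt in the criteria row, and the calculated field as an expression column.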
There were a number of candidates who had not shown evidence of some key stages (the building blocks in creating the solution) but were awarded a higher mark range for this element. The key stages will also count as earlier stages of creation in the next element (Evidence of solution). Where a candidate only shows a final version of the solution, the skills needed can only be implied, which means that only a very low mark may be awarded.

The Evidence of the solution can only be assessed by comparing what the candidate has produced against the criteria set (ie the Standard Analysis). Some good practice was seen where candidates used the Standard Analysis to indicate where in their solution each criterion had been achieved.

Similarly, Annotation can be awarded high marks (5+) where candidates have explained, described or stated how they produced the solution, whereas stating what they have done or will do next, rather than how it was produced, limits the mark to 4 or fewer.

Task 1: candidates needed to do all the items listed above in the design section. The system was usually set up well, using database software. It should be noted that any type of software can be used for this and other tasks, so long as all of the task is achievable. Candidates cannot be excused

from undertaking parts of a task just because they have chosen a less suitable package to complete the work.

Many candidates provided evidence for most of the tasks. In a task with many sub-sections, like this, it is advisable for candidates to use a checklist to confirm they have done all they are required to do. A copy of the Standard Analysis is ideal for this purpose. It does not have to be submitted, as it is just for the candidate's own use. The weakest sections were the calculated field and the efficient way to enter new items. Some candidates did not include anything that showed efficiency of use, and a number of centres had clearly not taught calculated fields, as none were present in the work of the more able candidates.

Task 2: this task required the creation of an interactive presentation for the club. Generally this was done well by candidates, who produced effective stages towards the final solution. Many candidates produced good evidence of parts of the solution. Efficiency was evidenced by higher-ability candidates showing the embedding of the data from Task 1. Most candidates appreciated that evidence of repeated skills is not required. This concept applies to all of the tasks but is particularly relevant to this one. The final printout should be displayed in colour; this is the one outcome that was frequently missing from candidates' work.

Model: when a model was produced, candidates' work on modelling the profit or loss for the club was generally good. Where errors existed they tended to arise from not using round-up functions. In Evidence of solution, where there is no model present, the marks for Tasks 1 and 2 should be totalled and divided by 3.

Testing

There are two elements to testing:
- creating the testing plan
- showing evidence that the test has been carried out and checked against the plan.

The plan should identify the purpose of the test, the test data and the expected results.
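A usable test plan pairs specific test data with checkable expected results worked out in advance. The sketch below shows the idea in miniature (the data values are invented for illustration):

```python
# One row of a testing plan: a specific purpose, specific test data,
# and expected results predicted in advance from the hard copy of the
# data file -- never a screenshot of the actual output.
test_plan = {
    "purpose": "List all items of which fewer than three were sold",
    "test_data": "Type 3 into the parameter query dialog box",
    "expected": [("Sunflower", 1), ("Fuchsia", 2)],
}

def check(actual, plan):
    """Pass the test only if the actual results match the prediction;
    without a prediction there is nothing to check against."""
    return "pass" if actual == plan["expected"] else "fail"

print(check([("Sunflower", 1), ("Fuchsia", 2)], test_plan))  # pass
```

This is exactly the comparison candidates should evidence with ticks or comments on the printout: actual results checked against the expected results in the plan.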
In many out-of-tolerance centres the test plans were lacking in detail, often without specific test data or expected results, and this severely limited the marks that could be awarded for this section. A number of centres had produced many additional tests but failed to test what they were asked to. It is the testing that is requested in the booklet that gains credit, nothing else.

The expected results, which should be found by the candidate using the hard copy of the data file, must show all the relevant fields that will appear on the list. It is not appropriate to show the actual results, eg a screenshot of the database, in the testing plan in place of the expected results. The expected results should be a prediction of the event, not the evidence that the event has occurred; that belongs in the actual results section. Lack of expected results in the testing plan will restrict the test evidence to a maximum of 3 marks.

Testing evidence should show clearly labelled results of testing, cross-referenced to the testing plan. For the award of the highest mark ranges it should be evident (eg using ticks, comments and/or marks on the printout) that the candidate has actually checked that the results are the same as expected. However, evidence can only be checked if there is a set of expected results to check it against, and this will impact on marks if it is not present.

Task 1: the test was to produce a list of all the items of which fewer than three were sold. Most candidates used a parameter query to produce this. The test data should make it clear what would

need to be typed into the dialog box to produce the correct results. Testing evidence was usually completed correctly, but some candidates did not show a screenshot of the completion of the parameter query dialog box.

Task 2: the test was to show that all the links worked from the first slide. The testing plans were usually correctly set up for the test, although occasionally candidates did not include all of the relevant tests, as there should have been more than one link on the first slide. The testing evidence was not always completed correctly, as there needs to be some evidence that the links actually exist, for example a screenshot of the relevant dialog box or a line of code clearly highlighted.

Self-evaluation

Most candidates were provided with, and used, the desired outcomes and performance criteria from the Standard Analysis. This offered them the opportunity to comment on their own solutions. For discussion or description of the effectiveness of their solutions, candidates should focus on at least three of the criteria and consider these in detail. They should also briefly evaluate all other criteria.

Report

Some candidates achieved very well in this section. Candidates who were successful created a formal report, including the name of the recipient and the date. They then introduced the issues involved and progressed to making recommendations to solve them. This should be done by considering a few aspects of the problem (three is sufficient) and incorporating research where needed. It is very important that candidates address the thoughts presented by the organiser rather than invent their own. Few, if any, marks can be awarded for a candidate's own thoughts on how well their project progressed.

Evaluation of others' use of ICT

There was still considerable variation in the suitability of candidates' responses to this section. In almost all cases, candidates appeared to have used an appropriate solution from another student.
Candidates should consider three desired outcomes/performance criteria and, for each criterion, consider how well the other student met it and how looking at that work would influence their own future projects. Most candidates considered three criteria, but they were not always the most suitable ones. Candidates should pick criteria that they can write about in depth, such as 'make it look professional' rather than 'there must be a logo on each page'. There is no need to consider more than three criteria. At the end of this section there should be an overall summary of the effectiveness of the other student's work. This was frequently missing.

Administration matters

Centre administration remained similar to previous years. There were still a small number of candidates who did not present their work in the correct order or whose work was disorganised. Securing work neatly with a treasury tag is the AQA-recommended way of presenting finished assignments.

Centres whose candidates are submitting work electronically should ensure that they follow the guidance in the Teachers' Notes. Only pdf or Word versions of correctly labelled files will be moderated.

The occasional lack of internal standardisation continues to have a significant effect on candidates' awards. To ensure consistency, centres must standardise their marking across the different teachers assessing candidates' work.

Teacher annotation: it is a requirement of the Code of Practice that controlled assessment work is annotated by the teacher to indicate how marks are awarded. It is evident that centres which did annotate candidates' work were more likely to have their marking agreed. It is perfectly acceptable for this annotation to be a phrase which simply indicates where in the candidate's work a particular criterion has been met.

A positive aspect was that many centres did submit the Unit 2 mark grid with the candidates' work, and this was particularly helpful in confirming the accuracy of centre marking. Many centres which continued to use the paper-based mark grid made arithmetical errors, and frequently averaged marks were not rounded up, as per the Teachers' Notes. As the majority of centres had used the electronic mark grid, there were fewer arithmetical errors on:
- the Candidate Record Form
- the transfer between the above and the electronic submission of the marks.

All appropriate paperwork needs to be fully completed and signed, including the Centre Declaration Sheet and the Candidate Record Form. Nearly all centres sent the correct forms with the work, although there were a small number of assignments without a candidate number and the necessary Candidate Record Form. Failure to comply with these requirements can cause delays in carrying out the moderation process. All necessary up-to-date paperwork can be located and downloaded from the AQA website.

Mark Ranges and Award of Grades

Grade boundaries and cumulative percentage grades are available on the Results Statistics page of the AQA website.

Converting Marks into UMS Marks

Convert raw marks into Uniform Mark Scale (UMS) marks by using the UMS conversion calculator on the AQA website.