GCSE Geography A
Controlled Assessment Report on the Examination
9030
June 2015
Version: v0.1

Further copies of this Report are available from aqa.org.uk

Copyright 2015 AQA and its licensors. All rights reserved. AQA retains the copyright on all its publications. However, registered schools/colleges for AQA are permitted to copy material from this booklet for their own internal use, with the following important exception: AQA cannot give permission to schools/colleges to photocopy any material that is acknowledged to a third party, even for internal use within the centre.

General

This was the sixth year of Controlled Assessments and, as the GCSE course is now linear, the majority of the 98,000 candidates whose work was moderated were Year 11 students. The options available to centres are limited to eleven tasks set by AQA, one of which must be selected for the investigation. Fieldwork must be evident within the work, and centres need to contextualise the task to meet local circumstances and opportunities. Controlled Assessment Advisers are allocated to centres to offer advice on the appropriateness of tasks and the data collection methods involved. They also help centres understand and interpret the assessment criteria and the Levels of Control involved.

With the Controlled Assessments, candidates have to complete all of the work, apart from data collection, in no more than 20 hours under the direct supervision of teachers or other members of staff at centres. None of the work can be completed at home.

The Controlled Assessment studies moderated this year often had a very clear focus because candidates investigated a single hypothesis or key question, as required by the specification. In the best investigations, the geographical concepts and processes studied were clearly evident and applied accurately throughout the work. The full range of marks was seen, and the work from most centres allowed clear differentiation to take place. Standards of organisation and presentation were variable, but the best work moderated was outstanding. The majority of the work seen was teacher directed, but teachers are not allowed to guide students during the High Level Control phases of the task.

The majority of centres applied the assessment criteria accurately and consistently, so their sample of investigations was within tolerance. The assessment criteria were clearly understood by the teachers in these centres, and the progression evident within the criteria had informed their planning. There were, however, centres out of tolerance because one or more studies had not been accurately assessed.

One issue that arose in a very small number of centres was work being returned to candidates so that changes could be made in response to comments made by the teachers. This is not allowed and triggers a malpractice review of the work from the centre. Writing frames and templates are not allowed for this component. Centres can identify a range of possible presentation and organisational strategies for candidates, but pre-prepared sheets cannot be provided for candidates to use. In some instances, moderators saw examples of such sheets in the Methodology and Evaluation sections; some of these carried too much supporting material for the candidates, and malpractice investigations resulted.

Administration

Where there are no more than 20 candidates, centres must send all of the work to the moderator and not wait for a sample to be requested. Some centres failed to include Centre Declaration Sheets with the work, or with the marks, and these had to be requested by the moderator. Examinations Officers could assist the moderation process by ensuring that all appropriate forms are sent with the sample.

There were many instances of inaccurate recording of marks on the Centre Mark Sheets. Some candidates had no marks recorded, and others had two different marks recorded because errors had been over-written and both marks encoded. Centres must make alterations clear when encoding the Centre Mark Sheets. Moderators also saw errors in the addition of marks awarded to candidates; this sometimes had an impact on the sampling process, so additional work had to be requested. Photocopies of pages from mark books cannot be used to submit marks to moderators.

Most centres carried out the administrative requirements with commendable accuracy and efficiency, and this certainly assisted the moderation process. Unfortunately some centres failed to send work to moderators on time, and it was mid-June before all of the samples were finally received for moderation. Centres should not use any form of postal or delivery service where a signature is required for the receipt of documents or work. Without a signature, the work may be returned to a sorting office or dispatch office at some distance from the moderator's home address, and this can lead to delays in the moderation process.

Candidates' work should not be sent in bulky folders, and it would be helpful if pages were numbered and all work was removed from plastic wallets. Candidate names and numbers must be recorded on the front of the Candidate Record Forms. It was clear that many centres carried out an effective internal standardisation process before final marks were determined.

Task Choices

The most frequently attempted tasks were those based on Tourism (35% of the work seen), Water on the Land (32%), the Coastal Zone (12%), Changing Urban Environments (7%) and the Living World (6%). Centres successfully contextualised the chosen task so that their candidates were able to produce valid investigations. There were some instances of centres attempting a task from the incorrect year of submission, or deviating from the task, although not to a degree that invalidated the work produced.

The Investigations

Many investigations exceeded the guidance of 2000 words and some were far too long. This was particularly evident where very able candidates had access to ICT for the majority, or all, of the time allowed for the task. Centres are advised to ensure that all of their candidates keep to the guidance of 2000 words. Shorter and more succinct pieces of work tend to be more tightly focused, they are easier for centres to mark and they are more manageable from the candidates' perspective. The more concise style has led, in some instances, to higher quality writing and more attention to detail.

It has also been found that candidates who produced excessively lengthy investigations often disadvantaged themselves in other ways. Many of these candidates wrote additional material that is not creditworthy. For example, some wrote at great length on aspects of the background of the study area that had no bearing on the issue being investigated.

Moderators saw investigations that were highly organised and effectively presented. ICT access may have been a problem for some centres, and some investigations contained handwritten work and ICT-produced material in varying proportions. This is quite understandable and perfectly acceptable. Some work was received this year on memory sticks. Centres can submit their candidates' work in any appropriate format as long as the work of each candidate is readily accessible and they have the opportunity of achieving maximum marks. Work produced on posters can, at times, be rather disorganised, but the best examples of investigations submitted in this manner were very good.

Teacher annotations on the work indicating Levels and marks were very helpful to moderators, and all centres should annotate the sample sent for moderation. Annotations must relate to the assessment criteria; simply adding ticks to the candidates' work is not helpful.

The Assessment Criteria

Each strand of the assessment criteria has three Levels, with each Level containing a number of different requirements. Candidates must fulfil all of the requirements for a particular Level before they can be awarded marks in a higher Level. It is not possible to award Level 3 marks before the candidate has met the requirements for Levels 1 and 2. Candidates may produce evidence that contributes towards the requirements of the higher Level criteria, but it is only when the lower Level requirements have been fulfilled that the higher Level evidence is considered and credited. The application of the assessment criteria, therefore, should not be seen as a best-fit model; it requires evidence of progression through the Level statements present within each strand.

There is a difference between the quality of evidence required to access a Level and that required to be secure at the top of the same Level. A problem seen by moderators was centres crediting a candidate at the top of a Level when the evidence showed that the candidate had only just accessed that particular Level. When this approach is used across more than one strand of the assessment criteria, it leads to the centre's marks being outside of the tolerance set by AQA.

Geographical Understanding

In the majority of cases, the investigations were well organised and underpinned by established geographical concepts that related to the taught Specification. Location evidence, whether in map form or through description, was usually very good. The location evidence should be used to fine-tune the awarding of marks within a Level; it cannot be used to move a candidate into or out of a Level.

To be successful in Geographical Understanding, the geographical concepts or processes underpinning the work must be identified and defined and then used accurately throughout the investigation. The assessment criteria in relation to this strand are very clear as to how this can be achieved. Level 1 explicitly requires candidates to identify and define the key geographical concepts and/or processes (key terms) that will underpin their investigation.
Many candidates demonstrated this in the introduction to their investigations by making statements such as 'My key terms are:' and then stating and defining four or five such terms.

Moderators do not need to see extensive glossaries (the largest glossary seen contained 29 terms) or excessive coverage of established theory from textbooks. The key concepts/processes must be directly relevant to the investigation. There were many instances of candidates listing and defining terms that were claimed to be 'key terms'; transect, random sampling, map, secondary data, 1/3 depth, survey and perpendicular were all seen this year and credited as being appropriate by the centres concerned. Definitions of key terms were, at times, insecure. The key terms should be derived from the Task set by AQA and from the hypothesis or key question being used to focus the investigation.

Once a candidate applies these key concepts/processes appropriately within the methodology, they can access Level 2. For example, if one of the key terms in a Water on the Land investigation is velocity, then this would be identified and defined in the introduction, the candidate would have a method of collecting data relating to the velocity of the river, and they would use this term when describing and justifying at least one of their methods. The concepts/processes must then be applied appropriately (not simply mentioned within the text) throughout the interpretations, the conclusions and the evaluation. To gain all 12 marks in this section, candidates must have used their key concepts/processes accurately and appropriately throughout the entire body of the work, and located their study in detail.

One error made in relation to Geographical Understanding was the failure of candidates to complete the Level 1 requirements. Whilst the investigations seen were certainly geographical in terms of their content and the vocabulary used, the candidates could not earn marks above Level 1 if they failed to identify and define their key concepts/processes. Credit for general use of specialist terms is given in the Interpretation strand of the assessment criteria. There were instances of key concepts/processes being implicit within the investigations, but the criteria explicitly require candidates to identify and define their key concepts/processes.

In many centres candidates are encouraged to highlight each key concept/process every time they use it within the work. The candidates, their teachers and the moderators can then clearly judge how effectively these concepts/processes have been applied to the investigation. It was not unusual to see candidates highlighting their key terms at the beginning of their investigations but then highlighting other words as the work developed. This year it was quite common for moderators to see candidates identify and define a number of key terms, thereby meeting the Level 1 requirements, but then fail to use them within their Methodology or any other part of their investigation. There were also many instances of candidates being credited for the incorrect use of one or more of their key terms. For example, tourism and tourists are not synonyms.

Methodology

This section was tackled well by candidates, with the majority gaining marks at Level 2 or above. The specification requires candidates to use one clear hypothesis or question to focus the investigation. This allows candidates access to the full range of marks whilst producing investigations that are well organised and close to the guidance of 2000 words.
There were, however, far too many instances of investigations based upon multiple hypotheses or a series of sub-questions; these investigations tended to become weak against the Interpretation criteria because candidates had too much material to process, analyse and interpret when working under High Level Control. Once candidates had identified a question or issue, stated how the investigation was to be carried out and provided a clear description of valid data collection methods, at least one of which involved the collection of primary data, marks at Level 2 were awarded. It should be remembered that the descriptor for this strand of the criteria refers to methods; therefore more than one data collection method must be used.

The quality of the description of the methods used to collect data varied considerably. It is recommended that candidates write the descriptions of their data collection methods in more detail than the justifications. Moderators saw descriptions as basic as 'I asked people to complete questionnaires' when the candidate could easily have included timings, locations, the number of respondents and the sampling process(es) used to select people to answer the questionnaire. Candidates can include annotated single copies of data collection documents, e.g. questionnaires, environmental quality scoring systems, land use classifications etc., to add clarity to their descriptions of the methods employed.

Descriptions must be very clear if candidates are to be secure at the top of Level 2. One way to judge this is to assess whether the descriptions provided for the data collection methods could be followed by another candidate to replicate the task. If this could be done, then the description would be very clear and worthy of credit at the top of Level 2. Moderators saw many poorly described methods being assessed too generously, with brief descriptions being awarded high marks. It is better to have four data collection methods, explicitly linked to the previously identified key terms and described very clearly, than it is to have eight or more methods described poorly. An approach seen within some samples was for candidates to use a sub-heading 'Step by step' when describing each of their data collection methods. This was very effective in ensuring that the candidates wrote the description (the HOW) of each method very clearly, making top Level 2 marks secure.

Level 3 marks proved to be more difficult for candidates to access and for centres to mark. Candidates cannot be credited at Level 3 without having planned part of the data collection for the investigation. The Level descriptor makes it clear that candidates must plan at least one method of data collection themselves and that this must make a significant contribution to the investigation. Therefore adding a question to a class questionnaire is not sufficient to access Level 3, nor is data processing, so calculating the hydraulic radius of a river channel, for example, is not a method of data collection. Moderators do not expect to see totally original data collection techniques within all of the investigations making up a sample. Instead they expect to see a range of methods being planned by candidates from a centre and, where similar or the same data collection methods are used by candidates, different locations, times, sampling strategies and sample sizes should be apparent. Secondary data can be collected to meet the Level 3 requirement here.

Candidates cannot access Level 3 marks if they have not planned at least one of the data collection methods used, even if all of their methods are clearly justified. Where investigations are teacher directed, plans must be made for Level 3 opportunities so that candidates are not limited to the top of Level 2. Where candidates demonstrate clear evidence of the Level 3 requirements, this should be noted on the Candidate Record Form and by means of annotations in the body of the work. Moderators did see examples of data collection techniques that were 'bolt-on' extensions to the investigations; these did not always develop the original investigation or help the candidate answer the question set or reach a more informed conclusion in relation to their hypothesis.
Marks can only be awarded for data collection methods that are clearly linked to the task and have provided data that have actually been used by the candidates within their investigations. Describing and justifying methods in the Methodology section does not earn credit unless there is evidence to show that data were collected by these methods and then used in the interpretation section of the investigation. Examples were seen where centres awarded marks to candidates for describing a range of data collection techniques, yet no results, or interpretations of results, relating to several of the stated methods were provided. Moderators also saw examples of candidates being awarded Level 3 marks when the teacher had clearly stated that there had been no individually planned data collection method(s) within the investigation.

The use of Methodology tables was popular again this year. Some of these were excellent, and candidates were able to describe and justify their data collection methods clearly and succinctly. Where such tables include columns for evaluative comments, candidates must complete these sections under High Level Control. Candidates who leave the evaluation of their methods until the Evaluation section of their investigation avoid duplication of key points and tend to link evaluative comments about their methods and results more effectively. Teachers must not provide pre-printed Methodology tables for their candidates. Failure to include any primary data within the investigation limits candidates to Level 1 marks in this part of the assessment criteria.

Data Presentation

The majority of candidates were able to access Level 2, although it was not uncommon to see weaker candidates assessed harshly and stronger candidates assessed leniently within this strand. As with the other criteria, the Level 3 requirements are more challenging, and many centres over-marked the work of their candidates in this section.

To reach Level 3, candidates must first fulfil the requirements for Levels 1 and 2. At Level 1 candidates have to produce a limited range of presentation techniques (they should aim to provide three different presentation skills); these can be basic and not quite complete, but they must be appropriate and carry a clear message about what the data show. Two additional presentation skills, both complete and accurate, can earn the candidate marks to the top of Level 2. Some candidates employed only one or two basic techniques but repeated them several times over. Duplication of basic techniques gains no credit for the candidate and should be avoided unless required for comparative purposes. It was not uncommon to see incomplete and inaccurate work given undue credit. In some instances almost all of the presentation skills seen within the samples selected from a centre were incomplete, yet other centres employed strategies such as checklists with their candidates to ensure that all presentation skills were complete. Within this strand candidates are being credited for completing geographical skills, so graphs should always be complete with a title (not simply a figure number) and labels on the axes; maps should always have an appropriate title, a scale (or scale reference) and a north arrow if marks at Levels 2 and 3 are to be considered.

Once the requirements for Levels 1 and 2 have been met, candidates can access Level 3 by producing examples of more complex presentation techniques. These higher-order techniques, if completed accurately, may include choropleth maps, scatter graphs with sufficient data plots and a line of best fit, proportional flow lines, graphs accurately located onto base maps, well annotated (not simply labelled) photographs, cross-sections drawn with due consideration to the scales used, dispersion graphs and so on. Simple graphs produced using ICT are not Level 3 presentation techniques. Moderators saw examples of land use maps, pie charts and radar graphs being incorrectly credited as Level 3 presentation skills. Some statistical techniques, with all working shown, can be Level 3 presentation skills.

The use of ICT has a direct bearing on the marks awarded in this section. There must be at least one clear ICT contribution to the investigation, excluding text, if the candidate is to be awarded any marks for Presentation.
Without evidence of ICT, the candidate cannot be awarded any marks in this part of the assessment criteria.

Data Interpretation

This section continues to be a very powerful discriminator, with progression through the Levels being determined by the key triggers of description/explanation, analysis, detailed analysis with links between data sets, and valid conclusions. The main weakness seen was where candidates gave descriptions without reference to the data they had collected and without offering any reasons for these findings. Centres often over-marked these descriptive accounts of the results.

Part of the Level 2 descriptor requires candidates to analyse their results by means of basic numerical data manipulation. In the best investigations the candidates described and analysed their results effectively. They organised and processed their data in such a manner that they could refer to percentages, fractions, ratios and averages whilst identifying trends, patterns and anomalies. This gave greater precision and meaning to their interpretations. They went on to provide logical explanations and demonstrate links between data sets. They reached valid conclusions, clearly based on and linked to evidence, that related to their original hypothesis or question. Where candidates did not provide a clear conclusions section for their work, they often failed to secure all of the Level 2 marks in this strand, and it also compromised their marks in relation to Geographical Understanding at Level 3. Centres sometimes credited candidates with Level 3 marks when the evidence for analysis was poor or missing altogether and where no links between data sets were evident. Links to the hypothesis are usually credited within conclusions.

The Quality of Written Communication (QWC) was often very good, with candidates expressing themselves accurately and fluently and using a range of specialist terms appropriately. There were, inevitably, examples of poorly written investigations where the QWC was very weak and corrections were not made to work completed using ICT, even when the spell-check function would have highlighted these errors. QWC can be used to move candidates into or out of a Level, but no evidence of centres taking such action was seen.

Evaluation

For Level 1 in this strand, candidates are required to reflect on the effectiveness of their data collection methods and suggest possible improvements or alternative methods. They should not be reflecting on hypothetical issues relating to their data collection methods, as was often the case when candidates included an Evaluation column or columns in their Methodology table; evaluative material such as this is not creditworthy. For Level 2 they must go further by considering how specific problems relating to their methods could have impacted upon the quality of the results obtained. They must then consider how the improvements suggested for Level 1 would improve the accuracy of their results. For Level 3, candidates must discuss the likely impact of all of these issues on the validity of their conclusions. Simply inserting the word 'conclusions' was seen by some centres as being enough for their candidates to be credited at Level 3 in this strand. Reference to specific issues and results must be considered alongside a judgement as to how these issues and results might have compromised the validity of the conclusions reached.

In the best Controlled Assessments, evaluation statements were detailed and specifically related to the investigation rather than being vague and generic.
Furthermore, instead of discussing the three components of the criteria separately, candidates were able to link them.

They achieved this by identifying specific problems with their methods that compromised the accuracy of a particular set of results and then impacted upon their conclusions to such an extent that those conclusions would have questionable validity.

In the weaker investigations, the evaluation was either missing or covered very briefly. Here the candidates often stated what went well or, if they reflected on possible improvements, produced a 'wish list' of what they would like to do next time. Such statements were very basic and made no reference to results or conclusions. The key point to remember about this section is that it is an opportunity for the candidate to provide an appraisal of the effectiveness of their investigation and to suggest how improvements can be made. Simple writing frames and templates to assist candidates with this strand were evident within some samples, but this is not allowed.

Final Observations

Many centres are enabling candidates of all abilities to produce interesting, relevant and, at times, exceptional investigations of small-scale issues. These centres are also assessing their candidates accurately using the assessment criteria. The best work came from centres where:
- The investigation addressed one key question or hypothesis
- Candidates identified and defined three or four appropriate key concepts/processes (key terms)
- Candidates applied these key concepts/processes explicitly and accurately throughout the investigation
- Data collection methods were described very clearly
- Presentation skills were accurate and complete
- An effective internal standardisation process had taken place

It was evident that in some centres internal standardisation did not take place, or was not accurate and effective, and this can lead to significant adjustments to the centre's marks.

Most centres have become familiar with the assessment criteria and use the statements within each Level to plan their investigations. Support material provided by AQA gives guidance on structuring the investigations and clarifies issues relating to the assessment criteria and the Levels of Control involved with the Controlled Assessment. Where centres are experiencing difficulties, support is available from Controlled Assessment Advisers and can be arranged by contacting the Geography subject team at AQA. There are exemplar investigations with commentaries on the AQA website, and other supporting material is also available from AQA.

Mark Ranges and Award of Grades

Grade boundaries and cumulative percentage grades are available on the Results Statistics page of the AQA website.

Converting Marks into UMS Marks

Convert raw marks into Uniform Mark Scale (UMS) marks by using the link below.

UMS conversion calculator