The main issue I have had in working on the New Higher has been judging the level of detail to cover for each entry in the mandatory content table.
I’ve put what I currently have as key points for SDD onto a single page here (as provided by me and a few other editors of the wiki):
You can edit the key points document by logging in with the username key and the password points.
I’m certain there are areas I have too much detail for, and others that lack detail. It seems sensible to me that the best way of hitting the correct standard is for those that are teaching it to propose it!
Also, any suggestions of better/alternative ways of sharing this (I thought about a Glow Word document as well?) would be good.
If you prefer, this Google Docs version is commentable (I’ll transfer anything over)
This was one of the main issues that was brought up at the “Understanding Standards” event I was at yesterday in Glasgow. I went hoping that we would get some good clarification on the assessment standards, but the qualifications manager said “what we have is comparable with the other subjects in the technologies area”, despite the principal assessor saying “it was coming in loud and clear that there wasn’t enough detail in the assessment standards”. This lack of detail seems to have led to the poor results for the first year of the new course. Thus we have a situation where the people on the ground (the PA et al.) are saying that there’s a problem, and have been saying so for years, yet the QM/SQA are unwilling or unable to change things.
In another thread someone posted a comparison of the assessment standards for CS, Chemistry and Physics, and the difference in the level of detail is heartbreaking. Physics in particular has really good, clear details of what is in each topic and what students are expected to know. It seems the “vague” idea of CfE has been taken to an extreme in the technologies subject area: we’re given something like “declarative language” and supposed to know what we should teach, to what level of detail, and what students could be assessed on.
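To see why a bare bullet like “declarative language” is so open-ended, here is a minimal sketch of my own (the table, function names and data are invented for illustration; this isn’t from any SQA document). Even a single line like this could be interpreted as anything from “mention that SQL exists” to having pupils write and compare both styles:

```python
import sqlite3

# Imperative: spell out HOW to get the result, step by step.
def top_scores_imperative(rows, threshold):
    result = []
    for name, score in rows:
        if score >= threshold:
            result.append(name)
    return result

# Declarative (SQL): state WHAT result you want; the engine decides how.
def top_scores_declarative(rows, threshold):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE results (name TEXT, score INTEGER)")
    db.executemany("INSERT INTO results VALUES (?, ?)", rows)
    return [r[0] for r in db.execute(
        "SELECT name FROM results WHERE score >= ? ORDER BY rowid",
        (threshold,))]

rows = [("Ada", 72), ("Grace", 48), ("Alan", 91)]
assert top_scores_imperative(rows, 50) == top_scores_declarative(rows, 50)
```

Whether pupils need to recognise the contrast, explain it, or produce code like this is exactly the kind of thing the bullet point leaves us guessing about.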
It’s so sad to go to these events and see things so clearly in a high level of disarray (albeit desperately being covered up by SQA staff joking about things or basically saying “we’re listening, but it’s people above us that spoil it”), despite teachers trying to mitigate this by highlighting the issues for the past few years. Short of a huge disaster and negative media attention (neither of which any of us want to see, and which we work hard on a daily basis to avoid) I’m not sure what it’ll take to jolt the SQA into doing something about this at a halfway reasonable pace.
The lack of urgency to rectify these problems from the SQA is incredibly worrying. I finished my probation year in 2012 and I’ve given serious thought to re-training in biology due to the problems associated with computing. I’m currently full time in a smallish high school and we really need a part time member of staff to cover computing classes but we’re having difficulty recruiting anyone. I do wonder whether this is a computing thing or if this is happening for other subjects too?
The same queries were brought up at the Glasgow event yesterday as were brought up at other events last year, and yet no one appears to be any further forward. For the lady who raised the issue of the time taken to mark coursework being well beyond our contracted hours, being told in as many words “this is what you signed up for” was incredibly depressing.
Would the unions perhaps get involved at all? Even if this is “what we signed up for”, it’s incredibly unfair that we have this huge burden of trying to decipher the assessment standards on top of marking however many pieces of coursework. I have 45 kids sitting Nat 5/Higher this year, so that’s 90 hours I need to magic up from somewhere, and others will no doubt have far more than me.
I had a colleague who was at yesterday’s event. He briefed me on the discussions and it appears nothing has moved forward since the Understanding Standards event I attended before the summer – intentionally, it would seem. To suggest that we as Computing Teachers agreed to this is IMO a legally actionable statement, as none of the documentation was available as anything more than incomplete drafts before the courses went ‘live’, and indeed many significant changes were made to N4/N5 and Higher Content and Assessment Packs months and even years after courses began to be implemented (it is no coincidence that the first textbook for N5 CS was out of date before it even reached full publication). I have also never obtained evidence that CS teachers agreed to internal marking of coursework elements, despite asking for that evidence. SQA representatives have merely asserted that an agreement exists. We can equally assert that such an agreement doesn’t exist, so it all falls back to the Working Time Agreements and the indisputable fact that we are being discriminated against.
Also, the fact that there are noticeable differences and inconsistencies in the approach to assessment standards and the level of detail provided between Computing (and Technologies in general) and the Sciences is a clear indication that either the QM and SQA people responsible for Science or those responsible for Technologies have their approach wrong – both cannot be correct, and the existence of the documents showing the disparity is enough evidence for an inquiry. Hiding under the banner of CfE (which as the days pass is becoming more and more a euphemism for abdication of responsibility by people in charge who don’t actually have the answers to the hard questions) is nonsense, as it doesn’t explain away the disparity.
The dismissive and frankly contemptible shrug-offs should not be tolerated – they are unprofessional and (as I said earlier) potentially actionable, as the people concerned have neither the remit nor the information required to comment on our contractual and professional duties.
Furthermore, what guidance is given turns out to be even more misleading – the suggestion that we should look to previous courses to work out the appropriate depth at which to teach new Higher CS concepts fails to take into account that much of the content existed in the previous courses at two levels – Compulsory and Optional Unit.
I’m sure it was said from the top table at some point yesterday that discussions at events like these do not hold any weight with the SQA when it comes to bringing about change. It was suggested it needs a group like CAS to have gathered evidence/opinion from a wide number of teachers and bring that forward to the SQA to help elicit change.
Is there anything that can be done to aid CAS in this, if that is the only way the SQA want to be approached? E.g. research, surveys, emails.
Or am I being naïve, and regardless of what we do they have their own agenda and any changes will still be made at a snail’s pace?
Note that we are at version 1.3 in the ‘History of changes’ to our Course Assessment Specification.
Version 1.1 was April 2014, 1.2 August 2014 and 1.3 June 2015 – all minor changes. E.g. version 1.2: minor clarifications to the language specification (functions) on pages 6 and 7, and records on page 10; “standardised pseudocode” now referred to as ‘reference language’.
Compare the Physics ‘History of changes’ to their Course Assessment Specification. They went to version 2.0 in April 2014 – before the new Higher courses had even started in June 2014 – where changes included:
Page 8 – the details of the skills to be assessed have been rewritten for clarity. Page 8 onwards – Further mandatory knowledge: these tables have been revised to aid understanding.
So they knew the course content wasn’t detailed enough before they even let anyone do the Higher for the first time and made significant changes.
Version 3.0 for them came out in April 2015…
Significant changes to clarify mandatory knowledge — table revised in ‘Further mandatory information on Course coverage’ section.
Sounds like people had asked for further clarification and they gave it to them.
How does it work differently for sciences compared to technologies? Maybe it depends on who is in charge of the subject at the SQA or their connections higher up the chain?
Without doing the research we can’t tell whether these levels of updates and clarity are unique to the sciences, whether their subjects particularly need them, or whether they’ve simply listened to their teachers’ opinions. Who knows? I’m sure there are other subjects sitting at version 1.2 or so of their Course Assessment Specification, but maybe they are fine and don’t need more clarity, or maybe they got it right in the first place.
Regardless, we certainly aren’t in the right place at the moment and require significantly more clarity.
If the discussions at the Understanding Standards events carry no weight, then what exactly is the point of them (other than to put the SQA even further into the red)? You learn precious little from them, as no one is willing to expand on the vague information that already exists. The best you can expect is to receive updates to SQA secure documents a few days or weeks in advance of them appearing on the website. Kind of an expensive way to distribute a few bits of paper, IMO.
At the previous event in Glasgow an attempt was made to deliver to the delegates some approaches to teaching new or unfamiliar aspects of Higher CS. What proceeded was a rip-off of Plan C Hub materials I had already seen and heard, including one long, drawn-out exercise that was supposed to demonstrate server-side scripting using coloured bits of paper. Once the protracted exercise had been completed, it emerged that the person responsible for creating the task was in the audience. Questions from across the floor elicited the response from him that he had tried the exercise in class and it had not gone well, taking too much time and not making much difference to the pupils’ understanding compared to the same topic delivered in a more traditional manner, and that he would not use it again in its current form.
Much derision from the assembled audience followed, understandably. Meanwhile, the SQA representatives sat dispassionately at the top table, unfazed as the clock ticked away….
@DCullen That’s very interesting information regarding the sciences, and physics in particular. It seems clear that compared with other subjects we’ve been hung out to dry over the amount of detail in the assessment standards. When I made this point to the QM at Wednesday’s event he told me (and the assembled audience) that we had the same level of detail as other subjects in the technologies area. I responded that we have a different level of detail from subjects such as physics, and I was told that I was entitled to my opinion; he quickly moved on to the next question without giving me a chance to ask anything further.

I think it would be fair to say that the question section at the end of the day is potentially the most valuable/useful part of it. The QM spent 5 minutes (I timed it) rushing through his answers to the questions that had been put on post-it notes (as he asked for) during the day. He finished and was about to rush back to his seat without any further ado when someone forced him to engage us by shouting out a question; he didn’t ask for any further questions or whether anyone needed any further clarification (would you do this in your class??). He then spent just under 5 minutes (again, I timed this) giving the party line, which included telling a lady at the front that “your school signed up for this course so you’ll have to do whatever is in it in terms of marking etc.” (not his exact words, but I think those who were there would agree that’s a fairly accurate description of what he said) before rushing back to his seat.

I despair! I didn’t wake up on Wednesday morning thinking “yes, let’s go and give the SQA, and in particular the QM, a hard time today”. I woke up hoping that this would be the day we’d get more help that would allow us to teach our pupils and give them a chance in the subject. I was expectant and hopeful for the day, but once again I was let down and left only with more frustration.
The reality is that this is actually a very serious situation. The PA reported that this year’s exam was not done well and pupils did not perform as they should. Thus pupils and parents will learn that the subject is not one they can do as well in as other subjects. If you were in that situation, what would you do? Likely you’d think twice about taking it. Thus we’ll have fewer kids taking CS, which will add to the issues we already have with some schools and local authorities reducing it. All this against the backdrop of a skills shortage in computing and the government wanting us to be a digital nation with a vibrant digital economy. How is that going to happen when we’re bankrupting it from the start by not giving the next generation a fair chance in the subject? I am surprised that there has not been more political pressure from the Scottish Government, but I guess they probably don’t know that these issues exist.
Totally agree with everything people are saying about the lack of detail as to what depth we teach. A bullet point is not enough (unless they are going to start accepting a single-line bullet point for answers). Perhaps everyone should start bombarding Angela Constance, the Education Minister, with all the points being made here, not accepting bland responses from her civil service sidekicks – just keep bombarding the government until they make the SQA sort out this disgrace.
@Enrico Vanni I’m sorry that the Understanding Standards event last week wasn’t as helpful as you thought it would be, but I’m not going to let a number of unfair assertions about the previous event you were at go unchallenged. See my replies below.
Point 1. “At the previous event in Glasgow an attempt was made to deliver to the delegates some approaches to teaching new or unfamiliar aspects of Higher CS. “
This wasn’t an understanding standards event but a set of Higher CS conferences across Scotland. We were asked to provide two workshops by Education Scotland in the morning and not the SQA. Server side scripting and files, records and open data were chosen because they had appeared as areas that a large percentage of teachers were concerned about.
Point 2. “What proceeded was a rip-off of Plan C Hub materials I had already seen and heard, including one long, drawn-out exercise that was supposed to demonstrate server-side scripting using coloured bits of paper.”
The Server Side Scripting exercise was created as part of a Craft the Curriculum event and it wasn’t part of the set of activities for local hub sessions. The leads who created the activity tried it out with their local hubs in Glasgow North and South Ayrshire in order to get feedback so improvements could be made, but it wasn’t widely available or used at any other hubs across Scotland. The File Handling, Records and Open Data activity wasn’t used with any hubs, so unless you’ve been a fly on the wall in Quintin’s office you wouldn’t have seen it.
Point 3 “Once the protracted exercise had been completed, it emerged that the person responsible for creating the task was in the audience. “
I’m sorry that you felt it took too long, but that’s possibly because you were already familiar with the exercise and the key concepts. As I moved between groups it was clear from the questions they were asking and the discussions they were having that many teachers weren’t as familiar with how the different systems interact.
Point 4 “Questions from across the floor elicited the response from him that he had tried the exercise in class and it had not gone well, taking too much time and not making much difference to the pupils’ understanding compared to the same topic delivered in a more traditional manner, and that he would not use it again in its current form.”
I was also in the audience listening to the question and answer session, and that is definitely not what either Peter or Gareth said in any of their responses. They regularly contribute to Compednet, so they can correct me if my recollection isn’t accurate either. There were a number of suggested improvements that people at the events made, and these were included in the final version of the materials that went up on CAS Community and the Glow Compednet site. I’ve run the exercise with my own Higher pupils and it didn’t take long, and it really helped them to understand the overall process. This has made more detailed practical exercises much easier for them, as they understand how all of the individual pieces fit together. I’d also be extremely wary of trying to classify particular teaching techniques as “traditional” and “non-traditional”, as this often comes down to a particular teacher’s preference.
Point 5 “Much derision from the assembled audience followed, understandably. Meanwhile, the SQA representation sat dispassionately at the top table, unphased as the clock ticked away….”
Again that’s not my recollection of how the majority of teachers responded during either of the question and answer sections. I can’t comment on the SQA section of the event as this was in the afternoon.
Having read the other comments I would say the reason other sciences don’t appear to have quite the same level of difficulties as us is because they have well established and well funded institutions such as the RSE and a number of different science subject associations to lobby on their behalf. CAS Scotland on the other hand is completely volunteer run by full-time and busy teachers who are trying to channel some of their own anger at the situation into positive action. We’re part of the BCS Academy of Computing but that only means some financial support for the conference and help with some admin and expenses. We’re therefore always limited by whether there is someone who is able to give time to help coordinate action on a particular issue and it’s always on a best effort basis.
On a positive note the research and report we compiled on coursework assessment that we shared with the SQA before the Summer appears to have caused them to start seriously looking at alternatives to the current arrangements for coursework. Collective action isn’t easy, and we may not always get exactly what we want, but it’s a much more feasible alternative than trying to make individual complaints which end up being ignored.
I agree with the majority of the points raised about the lack of detail in our mandatory content. It’s very frustrating and sometimes feels like a guessing game. That isn’t fair to us and it isn’t fair to our students either. We definitely need to band together and see if something can be done about this. I read through what Peter had to say and while the SQA might be considering alternatives to the current coursework arrangements, for me this would take higher priority.
@ Peter – it is understandable that you would want to defend your position, but your points don’t actually refute mine. The afternoon session of the event I attended took pretty much the same format as the Understanding Standards events, with an overview of the changes to the N4/5/6 Outcomes followed by the issuing of exemplar answers to unit assessments that we ‘marked’ in groups. You admit also that much of the teaching material presented had been used at some Plan C hubs – not all, I concede, but that is a mere detail and doesn’t change the fact that a number of people there had experienced it before, and their time might have been better spent on more pressing issues that were raised but not addressed by the top table. Plus, it was stated by the authors of the scripting exercise that, having piloted it, the version we used was not the one they would subsequently issue for use in class (which you have confirmed). Your differing observations of how the attendees reacted compared to mine may have had something to do with where we were sitting ;-).
My main reason for posting these observations though is that much of what went on was a distraction from the real and pressing issues we are now discussing regarding level and depth. I have heard through the grapevine what the proposed changes to courseworks might be and IMO that is fiddling while Rome burns.
Enrico, I was part of the group (of teachers and lecturers) that designed the server-side task. Unless my group in Edinburgh was being polite, they seemed to enjoy the task and it definitely triggered discussions on how much we understand about some of the content of the new course. Several revisions of this task were created based on feedback.
Much of this is down to the SQA’s approach – vague descriptors in the hope of creating a holistic, all-encompassing approach will not do at this phase in the development of the subject. Some of the content (hybrid cloud servers, digital certificates, server-side scripting to name a few) is complex and required deep understanding of multi-layered technology.
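To illustrate the multi-layered idea a single bullet like “server-side scripting” hides, here is a minimal sketch of my own (the function name, URL and HTML are invented for illustration; this is not from the SQA materials or the coloured-paper exercise). The essential point for pupils is that code runs on the server, and only its output ever reaches the browser:

```python
from urllib.parse import parse_qs

def handle_request(query_string: str) -> str:
    """Server-side step: parse the query string the browser sent,
    run some logic on the server, and generate the HTML to send back."""
    params = parse_qs(query_string)
    name = params.get("name", ["world"])[0]
    # The browser never sees this Python -- only the HTML it produces.
    return f"<html><body><p>Hello, {name}!</p></body></html>"

# A browser requesting /greet?name=Ada would receive the generated page:
print(handle_request("name=Ada"))
```

Even this toy version touches on requests, parameters, server logic and generated output – each a layer pupils need to hold in their heads at once, which is why a one-line bullet gives us so little to plan from.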
I am very keen that anyone who is able to boil these concepts down to sensible, understandable key points should do so, and that by doing this together we are much more likely to be successful. If we can agree on those points, we can build resources around a shared understanding, look for areas of miscomprehension or lacking detail, and work out roughly how long to spend delivering each one.
I have been working with others as a lead in Plan C for the last year and the one thing I would say more than anything is that relying on our own professional understanding and sharing it is the most likely way to produce great learning. This will take us half way across the bridge. The SQA has, I think, got to meet us there if we are to have the best new subject in CfE (the potential is there…)
“Much of this is down to the SQA’s approach – vague descriptors in the hope of creating a holistic, all-encompassing approach will not do at this phase in the development of the subject.”
I agree entirely, but unfortunately we have been told by certain SQA staff that this is all we are going to get and if we don’t like it we can lump it. I have already made the point to our QIO representative here in Glasgow that even if through “our own professional understanding and sharing” we manage to establish some sort of unofficial baseline standard all it would take is for one newly recruited assessor with little or no knowledge of what went before, or a disagreement with those unofficial standards to re-interpret the ‘bullet points’ their way and we are all back at square one (with almost certainly not the same level of publicity as this year’s maths exam). If detailed documentation exists for one CfE strand but not for another the inconsistency has to be questioned.
Interesting posts here folks, great input, thanks to all who have taken the time to comment in any way.
I think that CAS is doing great work and I think it would be fair to say that the majority of CS teachers are appreciative of what they are doing. I think some of the frustration is to do with the feeling that CAS are being used to fulfil a role that should possibly be done by the SQA. The point about CAS being run by teachers working in their spare time is very pertinent, and if family circumstances were different for me at the moment I would be looking to get involved and put my money where my mouth is. Yet another part of me feels that it’s not really fair to create a course, charge for it, have it half-baked, and then ask the people teaching it to point out the mistakes and often provide solutions, on top of what they are already doing, for free. We’d have been better just coming up with our own course and working on our own assessments!
For me this lies at the heart of the issue. The SQA have been told for years that there were issues here. Those responsible have done very little about them and now that the chickens are coming home to roost I feel that we need to be careful they don’t
1. Pass a load of work that should be done by them, or at least by teachers paid directly by them, onto CAS.
2. Absolve themselves of responsibility for the mess because (finally!) they are “now talking to CAS the body that represents CS teachers in Scotland”.
I don’t know for sure, and someone might want to give more detail on this, but I think I heard recently that the SQA had been fairly slow in working with CAS. Thus the current “we’re working really hard to work with teachers through CAS” sounds like it might be a convenient way for those at the SQA responsible for not getting this course into a better state sooner to sidestep what would seem to be their responsibility.
I think the point about people collectively doing something is spot on. This is where (notwithstanding my concerns raised above) I think CAS is a great body, and I am very thankful to those taking their own time to be involved with it. I also think that, since many of us have raised concerns individually with the SQA without much effect, doing it on behalf of organisations is a good way to make the point that it’s not just some random angry teacher. I know that some local authorities are looking to put together a collective letter to the SQA (and possibly the Scottish Government), and this would seem like a sensible thing to do at this stage and might be worth talking to LAs about.
I think I said it earlier in this thread, but despite sounding like a grumpy sod at the Glasgow event I’m really not looking to stir up a hornet’s nest for the sake of it. I would love to have had a great (or at least reasonable) Course Assessment Specification to work from and just to have shut up and got on with teaching the course, but that’s not the case, despite teachers making noise for the last few years about our concerns. We’re now in a situation (as seen by the poor results from the first cohort in May) where our young people are going to be disadvantaged in our subject. That is not fair, and that is not something I am happy to let slide, even for a year while things apparently get “sorted out”. From the approach and attitude of some members of the SQA it seems they are happy to “fiddle while Rome burns”, but like many others out there at the coal face teaching this and worrying about our students, I am not.