Archive for the 'Paper vs. Online Surveys' Category

Is the MOOC phenomenon being driven by the high cost of Post Graduate Programmes?

Interesting comment from Nicola Dandridge of Universities UK at the recent ARC Conference in Bristol: “Uptake for MOOCs so far is mainly a Post Graduate phenomenon.” My takeaway from her comment is that the high cost of postgraduate offerings is creating a real gap in the market. Unfortunately, this leaves the door open to US ‘for-profit’-style solutions, and venture-capital-led US companies are already exploiting it. The MOOC concept in its current iteration seems harmless enough; issues arise, however, when the venture capital financiers claw back their investment, forcing a move towards the flawed online ‘for-profit’ approach that is so common in the US higher education landscape.

The Dark Side of ‘For-Profit’ Online Programmes: The reality is that, in the US, over half of students drop out of fee-based online programmes. Unfortunately, these students are typically the least able to afford the burden of the lifelong debt they are forced to pay back. Of further concern, as these programmes proliferate, is the limited visibility of the quality of teaching and learning based upon student feedback.

The Open University-led ‘FutureLearn’ project, recently launched in the UK, looks like a promising response to the venture-capital-led approach coming out of Silicon Valley: UK universities in online launch to challenge US. In my view, it is much better to look to online courses as a route to innovation in meeting the needs of students rather than a pathway to profit for the boardroom. The key point is the importance of putting concerted thought and effort into prioritising student retention and implementing quality systems for enhancing teaching and learning.

6 Key Things to Consider When Managing Paper Based Course Evaluations

Looking back at this posting from 2010, I wanted to highlight why, even close to three years on, these points are still very relevant.

  • Students haven’t changed: Ask any academic – as soon as a student leaves the class, the chance of getting them to respond to a survey is low. In-class surveys are the best way to capture a high response rate.
  • Using the right survey methodology: Course and module evaluations (50 students or fewer) require a high response rate in order to avoid non-response bias. In this scenario, paper is best. With overall satisfaction surveys, where thousands of students are responding, a high response rate is less critical, and the online survey approach is best. This is especially important if you are looking to identify the specific programme the student is on.
  • Combating the threat from social media: With the advent of Twitter and sites such as ratemyprofessor.com, it is more important than ever for universities to protect the reputation of their academics with robust, evidence-based course evaluation built upon valid results. If students do not believe the institution is interested in their feedback and transparent in implementing changes, there is a real threat that their opinions will enter the public domain through social media. Universities must take the lead in protecting their reputations and becoming the trusted source for their students on improving the quality of teaching and learning.
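The sample-size intuition behind the methodology point can be illustrated with a short Python sketch. The numbers are hypothetical, and the standard margin-of-error formula with a finite population correction measures sampling error rather than non-response bias itself, but it shows why a 40% response rate that is perfectly usable for an institution-wide survey leaves a 30-student module with results too imprecise to act on:

```python
import math

def margin_of_error(population: int, respondents: int,
                    p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion, with finite population correction."""
    if respondents <= 0 or respondents > population:
        raise ValueError("respondents must be between 1 and population")
    se = math.sqrt(p * (1 - p) / respondents)
    if population > 1:
        # Finite population correction: sampling most of a small class
        # shrinks the error, but 12 of 30 is still a wide interval.
        se *= math.sqrt((population - respondents) / (population - 1))
    return z * se

# A 30-student module at a 40% online response rate (12 replies)
# vs. an institution-wide survey of 10,000 students at the same rate.
print(f"module: +/-{margin_of_error(30, 12):.1%}")       # roughly +/-22%
print(f"institution: +/-{margin_of_error(10_000, 4_000):.1%}")  # roughly +/-1%
```

At module level, only a near-complete in-class return narrows the interval enough (and limits the room for non-respondents to differ systematically from respondents), which is the case for paper.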

————————————————————————————————————————————————————————————————————————

6 Key Things to Consider When Managing Paper Based Course Evaluations

Many higher and further education organisations are unable to provide computers in every class or to run online surveys in computer labs for logistical reasons. For these organisations, paper-based ‘in-class’ surveys are the best option for driving response rates. For organisations choosing paper surveys, there are some key points to consider.

1 – Gain stakeholder buy-in as the key incentive to participate.
The best way to gain active participation in the survey is to clearly articulate its vision and purpose and what actions will result. It is important for the executive of the institution to communicate this to the students and, of equal importance, to the instructors.
For students: They need to know that their participation is a key element of the overall quality enhancement effort and that decisions and actions will be taken based upon the outcomes.
For instructors: In paper survey administrations, instructors play a big part in administering the survey to the students. If they are suspicious of the process or feel that the responses will be used against them, they have the opportunity to influence the results. Instructors need to understand that the results will not be used against them; rather, the results will be used to help them improve their teaching. If this message is clarified, supported by the executive vision statement and reinforced throughout the organisation, instructors will participate and assist in the process.

2 – Create surveys that make sense and report on what you need to know.
A well-designed questionnaire with clearly defined question-group objectives and valid question content helps the student engage and provide the best feedback. The questions also need to make sense and follow through to provide the key indicators that are unique to each institution’s reporting requirements. Unique to paper survey administration is the need to keep the questionnaire short in order to minimise the impact on the administrative staff tasked with scanning the results. A shorter questionnaire is also easier for students to complete, which again yields better feedback. Remember to avoid closed (yes/no) questions and use scaled questions wherever you can in order to gain a deeper understanding of students’ attitudes and perceptions. Do not rely too heavily on open comments: they are difficult to quantify, and students lose interest if asked to comment constantly.

3 – Reinforce the concept of anonymity to get honest student feedback
A good way to create stakeholder buy-in from the students is to ask the instructors to read a short prepared statement that highlights the importance of the survey and the action that will be taken from the results. The procedure should also include the instructor leaving the room while the students complete the survey. A student representative should be asked to collect the completed surveys and deliver them in the provided sealed envelope to the administration offices. These additional steps reinforce anonymity because the opportunity for instructor influence is minimised.

4 – Use in-class time
Educational institutions have an advantage in driving response rates owing to the opportunity they have to ask the survey respondents to complete the surveys during class. There are three main points to consider as to why this leads to increased survey responses.
1. Students are used to being asked to complete assignments in class, so there is a natural tendency to comply with the request.
2. Administering the survey during class time results in students taking their time in providing feedback. Remind instructors to avoid handing out the survey at the end of class, as this leads to rushed responses and partially completed results.
3. Having students complete surveys in class takes advantage of the herd instinct: in a social collective there is tremendous pressure to do what the majority of one’s peers do, so when a survey is handed out in class the natural tendency is to complete it.

5 – Standardising the course evaluation process will help avoid headaches later on.
It is understandable that many higher and further education organisations question the way they run their course evaluations. This tendency is influenced by the sometimes conflicting regulatory requirements of the various governing bodies. It is important not to lose sight of the value of quality enhancement and the need to drive overall internal quality improvement. The best way to achieve this is to aim for ‘holistic’ strategic initiatives that are representative of the entire student experience. By taking a ‘holistic’ approach, the executive at the institution gains the most transparency and can evaluate the outcomes strategically to make the best decisions.

6 – Survey administration should be a second job, not a day job.
It makes sense to look at the way course evaluations are run in higher and further education organisations, as this process represents the best way to gain an understanding of students’ perceptions and attitudes. Many organisations have outdated technologies, and significant efficiency gains can be made by deploying available ‘best in class’ technology. An inordinate amount of energy is wasted on the challenges of paper-based survey administration and reporting on course evaluations. This energy is better spent on driving quality to implement change and key strategic initiatives. Technology can be implemented to minimise the manual constraints and challenges surrounding both paper and online course evaluations. The decision criteria need not centre on paper vs. online, but rather on implementing technology that allows for the best use of either methodology.

ELECTRIC PAPER

Electric Paper works with universities to capture student feedback that will help to improve the future design and delivery of their courses. We work with more than 600 education institutions worldwide through web-based data capture and student survey management solutions to drive efficiency in capturing the student voice at course level. Our flagship product EvaSys Education Suite™ works to automate course and module evaluation and reporting, thereby saving staff time and costs, as well as generating cross-institution common best practice in assessing student feedback about their teaching and learning experience. In light of the need to place greater emphasis on the “student experience” in UK higher education, our role is increasingly significant. We also deliver surveys of alumni, employees and other members of the education community. For more information: www.electricpaper.co.uk

Electric Paper ensures data security whilst saving the environment through partnership with Shred-it UK.

Electric Paper Ltd. is pleased to announce the selection of Shred-it UK for secure document destruction and data protection compliance for its outsourced survey services offering. The selection of Shred-it for mobile document destruction ensures complete and secure data protection for our scanning clients.

Once materials have been shredded on-site at the Electric Paper facility, they are subsequently baled and recycled into a variety of useful paper products. This process ensures that our customers’ confidential information is always disposed of in the most secure way possible, whilst helping save the environment in the process.

“Electric Paper is proud to be part of this active recycling programme. In addition to the peace of mind as a result of the Shred-it mobile document destruction service, we are gaining a running tally of the number of trees saved per annum – 37 trees so far!” Eric Bohms MD, Electric Paper Ltd.

For more on the services provided by Shred-it, please see the link provided below:

Corporate responsibility and environmental benefits.

National College of Ireland: Streamlined quality assurance and reporting with EvaSys Survey Management

National College of Ireland

For 60 years, National College of Ireland (NCI) has been a leading provider of graduates with the skills and knowledge to meet the existing and emerging needs of the Irish economy. Evolving with the changing employment landscape, the college has built an enviable reputation for excellence in education and for designing programmes that are relevant to the workplace. Today, NCI is a Higher Education provider committed to advancing knowledge in its specialist areas of business and computing.

Background

The gathering of student feedback is essential to the quality of the learning experience for students as well as programme development. Historically, data from student evaluations was sent to an external data processing and analysis service which was expensive, inflexible and slow. The Executive Board requested that this work be brought back in house and surveys deployed online to minimise the cost of the exercise.

It was anticipated that students would respond directly to surveys through an online questionnaire. However, experience has shown that response rates for the online survey are very low, giving limited value to the outcome. To date, the most successful approach to running online surveys has been for Programme Co-ordinators to ‘guide’ a group of students through completing the surveys in class. This method is, however, unsuitable for large class groups, off-campus students, community-based teaching and classes not normally taught in a computer lab environment. In these situations the best method is a paper-based survey administered in class. Additionally, when using a product called Survey Monkey, survey preparation required manual input of information for all modules taught on all programmes each semester, a process that took approximately 16 working days per year.

Summary of EvaSys Benefits

  • EvaSys allows the college to use both online and paper surveys depending on the student body thus improving response rates. Over time, it is hoped that the number of online surveys will outweigh the number of paper surveys which will further reduce costs.
  • It is now possible to import module information that is currently held in the student record system, thus reducing the survey preparation time to less than 1 day.
  • EvaSys makes use of module information and structure from the student record system in order to streamline the creation of surveys and reports.
  • There has been a significant improvement of the data capture processing by using distributed scanning instead of manual data entry, making better use of staffing resources.
  • The solution allows for almost immediate analysis based on pre-developed standard survey reports that have been optimised for the college’s use.
  • Deans of School and individual lecturers can immediately receive survey feedback reports via email and/or staff portals. This also facilitates feedback to those working away from the main campus. It is planned to publish these on student portals in the next phase of the project.
  • Feedback can also be provided swiftly to stakeholders such as corporate and government clients.
  • With the introduction of modularisation, comparison of feedback from different cohorts of students and/or programmes is more easily available.
  • Trend analysis is also possible over different time periods.
  • The survey system can be used in several other areas of the college with little additional cost. (marketing, student services, HR etc.).
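The record-system import that collapsed survey preparation from 16 working days to under one can be sketched in a few lines. This is a hypothetical illustration, not the EvaSys interface: the CSV columns, the `build_survey_jobs` helper and the 50-student paper/online cut-off are all invented for the example, but it shows how a batch export replaces re-keying every module by hand:

```python
import csv
import io

def build_survey_jobs(module_csv: str) -> list[dict]:
    """Turn a student-record export into one survey job per module.

    The CSV columns (module_code, module_title, lecturer, enrolled)
    are hypothetical -- any export from the student record system
    with equivalent fields would do.
    """
    jobs = []
    for row in csv.DictReader(io.StringIO(module_csv)):
        jobs.append({
            "survey_name": f"{row['module_code']} - {row['module_title']}",
            "lecturer": row["lecturer"],
            # Small cohorts default to paper to protect the response rate.
            "method": "paper" if int(row["enrolled"]) <= 50 else "online",
        })
    return jobs

# Example export: two modules from a hypothetical student record system.
export = """module_code,module_title,lecturer,enrolled
BUS101,Intro to Business,J. Byrne,180
COMP205,Databases,A. Kelly,32
"""

for job in build_survey_jobs(export):
    print(job["survey_name"], "->", job["method"])
```

Once the mapping is defined, each semester's run is a single import rather than days of manual form-building, and the same export drives the hybrid paper/online split described above.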

“We were really surprised by how much work was involved in using a generic survey tool like Survey Monkey. Not only was the response rate low – but we also had to create a unique survey and report for each module, taking more than 8 days a semester. Now we have a highly efficient hybrid system with in-built reporting. EvaSys provides us with the time to focus on best practice and policy for assuring quality.”

Sinéad O’Sullivan: Director of Quality Assurance & Statistical Services.  National College of Ireland

Is distance learning the problem?

I was startled by a statistic highlighted in the recent article in the Times Higher Education entitled ‘The revenues look good, but are the customs sound?‘ covering the US Senate committee report on the for-profit higher education industry.

‘Online distance-learning programmes, a key focus for many for-profits, have dropout rates as high as 64 per cent.’

With the proliferation of online distance learning courses across the sector and the pressure to compete, the 64 per cent dropout rate should be of equally grave concern to helicopter parents, executive teams and funding bodies alike. It possibly says more about the distance learning approach and its detachment from the physical learning culture and supportive environment of the institutional campus. It is difficult to quality-assure a programme with a 36 per cent retention rate. Worse still, bad loan debt becomes a heavy burden for the majority of students (many of whom, in this example, are lower income) who walk away from their courses.

Is distance learning bad for HE? This should be at least considered as many institutions are looking to massively expand their online curriculum.

Swansea University: EvaSys Testimonial

Swansea University is a research-led University with over 16,000 students that has been ‘making a difference’ since 1920. The University community thrives on exploration and discovery and offers the right balance of excellent teaching and research, matched by an enviable quality of life.

In the past, module surveys had been handled at a department level in an ad hoc fashion. Various methodologies had been tried, leading to a diverse set of results for use in evaluating the quality of the module as well as the overall student experience. In December 2011, Electric Paper’s flagship product EvaSys was piloted across several departments using both the paper and online methodologies. The pilot was very successful, yielding a high response rate and fast turn-around of the results. The decision was made to implement the system throughout the University for lecturer feedback as well as overall module evaluation, using a set of core questions, for the spring 2012 semester.

The institution has now fine-tuned its use of EvaSys, implementing a hybrid of online and paper methods based upon the requirements of each department, with the goal of achieving an overall response rate above 50%. “EvaSys has enabled us for the first time to gain an overall visibility as to the quality of our teaching at the modular level, simplifying a process that had become very cumbersome and difficult to manage for all the stakeholders involved,” said Alan Speight, Pro Vice Chancellor for Student Experience. “We are now looking at ways to use the results strategically to identify excellence and close the quality enhancement loop with students.”

Phil Brophy, Student Experience Strategic Projects Manager has had the responsibility of the implementation of EvaSys for module surveys. Mr Brophy stated “Electric Paper has provided excellent service and support throughout the whole process. They were extremely helpful in identifying a number of logistical issues in the evaluation cycle and were always happy to help.”

Reflections on the 2012 QAA Annual Conference

Conference Evaluation: The QAA selected Electric Paper to provide conference evaluation services for their 2012 Annual Conference. As Sir Rodney Brook pointed out in the conference welcome, Electric Paper is “giving us a taste of our own medicine and subjecting us to our own quality review!” This was, of course, a great honour for us, and the feedback from the delegates confirms the excellence of the venue and speakers and the appropriateness of the content covered at the event.

Interesting titbits:

  • Move towards risk-based quality audit – meaning students can prompt institutional review. The detail of this is yet to be defined; in order for it to work, institutions are expected to implement clear quality enhancement systems drilling down to module level.
  • A good definition of enhancement from the panel: we don’t care how good you are – the point is, can you be better? – versus simply meeting a defined quality threshold.
  • Great to see such a strong presence from interested student reps and active participation from NUS delegates!
  • AC Grayling – intelligent argumentation advocating a move towards the US model of institutional endowment, allowing the best and brightest to study with the institution covering fees based on what they can pay.
  • QAA chief exec Anthony McClaran was an excellent chair for the panel session. When challenged with arguments that tended towards broader HE sector issues, he rightly navigated the discussion back to how they relate to improving and enhancing quality.

Are HEIs entering a market through KIS data sets?

http://www.guardian.co.uk/higher-education-network/blog/2012/jun/19/distinctiveness-in-higher-education?commentpage=last#end-of-comments

HEIs are busy right now compiling their KIS data sets for submission. A challenge they face is how to present hot topics such as contact hours. The reality is that the presentation of key information sets is open to interpretation, and concern about how the data looks in relation to other HEIs ties directly into the idea of distinctiveness. Could this be an initial indication of the formation of a market? Let’s hope it does not lead us down a slippery slope to a US-style free-for-all.

Whose survey data is it anyway?


When meeting with various HEIs across the UK and Ireland, it is clear that the way student feedback is used is intrinsically tied to the autonomous and individual character of each institution. One wonders, however, how much of the reporting requirement is based upon individuals’ requests for data presentation rather than on best practice. Custom report requests from the executive team, academics or students can take on a life of their own and harden into engrained processes. Too often, streamlining the work required in collating data for various stakeholders has been low on the list of priorities, leading to repetition of work across an institution and laborious manual processes. Limitations in legacy client-server technology have compounded this issue, which is why, given the threat of further funding cuts, it is a perfect time to highlight institutional reporting in quality assurance for process improvement. Below I have highlighted several dos and don’ts for consideration.

Best Practice Report Dissemination

Do consider the importance of providing relevant and timely reporting to stakeholders within the institution.

This includes the providers of services, the quality unit, the executive, academics, and the students. Evaluation reporting should be supplied to the tutor before the module ends so that they have time to discuss results with the students.

Don’t overload the various stakeholders with information that is not relevant.

The providers of services have no interest in the level of satisfaction with programmes, and the students have no need to receive comments meant for the academics looking to improve the design and delivery of the course. Invariably, reports that are not relevant lead to requests for clarification or for further analysis and report presentation.

Do centralise the reporting function and provide it as a service to staff.

There is no way of efficiently providing timely and efficient reporting to the various stakeholders if each academic or programme leader is generating reports as separate processes.

Don’t forget the participants are stakeholders:

Whenever planning a survey project, make sure feeding back to the participants in the survey is given priority. This is especially true in regards to outcomes/actions on modules.

Should students be trusted with quality assurance?

At the centre of the recent consumer argument in higher education seems to be a fear among certain thought leaders that a more engaged student could lead to the productisation of the sector. Intrinsic to this position is the paternalistic idea that the student does not always know what is best and therefore cannot be entrusted with pedagogic leadership. The fact is that these supposedly disengaged students, through research and peer influence, do successfully seek out and identify the best universities and programmes. This raises the question: do universities become great because of, or in spite of, the active engagement and participation of their students?

In the Times Higher Education article published on 12 April, QAA’s new riff on student feedback: positive notes or waves of jargon?, reporter Jack Grove points out that “Plans to involve students more heavily in quality assurance have been criticised for introducing unnecessary ‘jargon-encrusted’ bureaucracy to universities.”

Students are talking – are universities listening?

Institutions are very active in quality assurance efforts through the NSS and many other survey projects. Yet one of the biggest challenges universities face is letting students in on the results, especially at course and module level. If the opinions expressed in the Times Higher article are an indicator, a potential reason for the lack of student engagement may have something to do with a patronising view of the student rather than any lack of desire on their part to participate in improving the quality of their education.

Does transparency matter?

James Williams, Associate editor, Quality in Higher Education added this comment to the Times Higher article. “It is to be hoped that the code will encourage institutions to explore some of the ways in which student feedback was collected before the advent of the NSS. The most successful used instruments that were designed in consultation with the students themselves and that reflected their concerns rather than those of senior managers or statisticians. The resulting data were triangulated with other important sources of intelligence to inform (not dominate) management decision-making.”

Could it be that students can be trusted to actively direct their academic future? There is evidence of their abilities in this area in their use of informal channels and social media to make informed decisions and voice their opinions. Informal peer networks are a reality, and simply monitoring Twitter and Facebook will not protect the reputation of academics and institutions. With this in mind, it makes sense for universities to take a lead role in cultivating partnership and consultation and to consider the student an active stakeholder in their education. Improving survey administration and quickly turning survey results around to students, thereby closing the feedback loop based on statistically valid evidence, is a prudent way to cultivate trust, transparency and engagement.

