Personalized Learning and Engagement: Why Flipped Classrooms are the Future of Education

By: Jessica Rand

While the concept of the flipped classroom, and the very name itself, may seem backwards, in reality it is an innovative way of offering students and educators a more engaging, personalized learning experience. With the rapid growth of online and blended learning, and the proliferation of technology devices, educators have more opportunities than ever to provide personalized, meaningful learning experiences to their students. Using a flipped classroom model is one of the most effective strategies for achieving this.

What is a flipped classroom?

One of the biggest challenges that educators face is trying to meet curriculum or course outcomes within a short time period. As a result, many courses adopt a lecture format to instill as much knowledge as possible within the calendar constraints. While the majority of educators understand the benefits of collaboration and problem solving, they struggle to find the time. The flipped classroom model addresses this challenge by “flipping” the way courses are taught.

Flipped classrooms are not an entirely new practice. Many people have experienced some aspects of the flipped classroom model when their professors have asked them to pre-read course materials or watch a video online before coming to class. Anthony Bates explains that constructivist theories of education argue that learners must construct personal meaning through experience and reflection. Flipped classrooms can be so much more than having students pre-read a chapter before attending a lecture. The Journal of Nutrition Education and Behaviour reports that over 20 years ago Alison King argued that educators should move from being a “sage on the stage” to a “guide on the side”. She argued that many instructors try to impart their knowledge by lecturing to a room of students, and that they would be more effective if they guided students through exploration of the content. The goal of exploration and collaboration is at the center of flipped classrooms.

An instructor engaging in a flipped classroom will provide students with a means of learning course content outside of class time, followed by collaboration and learning activities in the class. For example, a student may learn course content for a science class by watching a short video and playing a relevant online educational game before each class. When the student comes to the following class, he is already familiar with the content and is able to engage in more meaningful learning through discussion, collaboration and experiential learning.

Benefits of flipping a classroom

Individual student needs are addressed more effectively in flipped classrooms. Content can be made available in different media; for example, students who struggle with reading can listen to informational podcasts or watch videos. Rather than spending class time teaching content, educators are able to meet with individual students, facilitate learning stations, and engage in hands-on experiences with students. Students are also able to access content material at a time and place that is convenient to them, which allows them to learn in a way that meets their needs. Student autonomy over learning is critical, especially for struggling students. Learners who take ownership of their learning are more likely to engage in personalized learning.

Engagement is another advantage of flipped classrooms. Many students become bored or disengaged in traditional class settings, for two reasons. First, many students find lecture-style teaching dull and uninteresting. The Journal of Nutrition Education and Behaviour reports that a student’s attention declines after just ten minutes of class lecture and that only about 20 percent of material is retained. Second, students who require accommodations or extensions become disengaged when content material is presented in a way that is either too difficult or too easy for them. Flipped classroom educators have the opportunity to provide content in a more varied, personalized manner and to provide truly meaningful, engaging experiences in the classroom.

The role of technology

As you can imagine, technology plays a vital role in flipped classrooms. An effective flipped classroom does not simply ask students to read chapters of a textbook prior to class and then expect them to have learned the content. Flipped classrooms use technology as a tool for facilitating and improving learning experiences. Great examples of technology being used effectively in flipped classrooms are podcasts, Smartboards, voice recorders, video logs, blogs, document cameras, robotics, and tablets.

Technology provides many opportunities to students learning in a flipped classroom, and the devices and programs available are becoming more exciting every day. There are, however, many terrific ways to offer students a flipped classroom learning experience without obtaining a Computer Science degree. Educators should use technology that they are familiar with and slowly expand their expertise to include more devices and programs that they feel would benefit their students.

Many post-secondary institutions are choosing to provide flipped classroom learning experiences to their students. The proven benefits to student engagement and personalization of learning indicate that students are seeing more success with this style of teaching. Educators from the University of Hong Kong provide several strategies for teachers inspired to start their own flipped classrooms. They recommend that instructors take the process one step at a time and keep technology simple (especially in the beginning), avoid providing too much information to students at once, find ways to engage students with the material, and reflect on their practices to make changes as needed.

It is an exciting time to be in the education system. The proliferation of technology devices and programs has changed the way we learn and teach. As educators, we can harness the strengths of emerging technologies and practices such as the flipped classroom to engage each of our students like never before.

About the author

Jessica Rand teaches part-time in beautiful Prince George, British Columbia where her three sons keep her busy. She is currently working on an Online Learning and Teaching Diploma from Vancouver Island University and says that her hope as an educator is to use technology to engage students in their own learning journeys. When not teaching, Jessica spends time with her husband and boys, and also loves to read, run, volunteer and spend time with friends.

New member – George Abraham

By: Kelly Smith

Whenever possible, we like to welcome new SIG members by asking them to share a bit about themselves. In this issue, we welcome George K. Abraham IV.

George is a materials scientist, technical communicator, and manager of technical services at Allied High Tech Products Inc., where he has become an industry authority on metallography. He is responsible for providing technical support, seminars, workshops, training, and demonstrations on Allied’s metallographic equipment and consumables, Zeiss’s optical microscopes, cameras, and imaging software, and Mitutoyo’s hardness and microhardness testers. He has a Bachelor of Science in materials science and engineering from Case Western Reserve University, and he previously held positions at H.C. Starck and Rhenium Alloys Inc.

George manages Allied’s applications laboratory, overseeing the development of metallographic procedures and assisting with research and development of new metallographic equipment, accessories, and consumables. He has authored numerous application notes, reports, technical bulletins, operation manuals, technical articles, papers, presentations, and webinars.

George serves on the editorial board of the journal Metallography, Microstructure, and Analysis, as secretary of the International Metallographic Society Board of Directors, and as a member of various professional committees focused on standards, education, and mentoring.

Also appreciative of the art of metallography, George has been known to get lost in microscopes exploring the beauty of materials; his favorite microstructure is nodular cast iron. George has developed and taught materials sample preparation seminars for ten years and enjoys mentoring emerging professionals in science and engineering.

Welcome, George!

If you are a new member and would like to submit a bio, please email it to newsletter@stcidlsig.org!

Are VARK and Other Learning Styles a Legitimate Pedagogical Theory?

By: Noah Page

The growing case against learning styles

As technical communicators, we are responsible for understanding our audience to the very best of our ability so that we can provide the most effective and accessible documentation possible. While rich media content grants us the ability to adapt our content according to our audience’s preferred method of learning, is it really best practice to categorize our audience into supposed learning styles? With this question in mind, technical communicators certainly have a stake in the evolving debate about learning styles.

While certain educational institutions place a high value on accommodating a variety of learning styles, many publications in recent years have begun to question the legitimacy of fixed learning styles. Writing for The Atlantic, Olga Khazan reported that 90% of teachers in a number of countries throughout the world believed in learning style theories. However, Khazan also detailed a growing body of evidence suggesting that learning style theories are not scientifically sound. In Scientific American, Cindi May performed a brief meta-analysis of recent studies about learning styles. May concluded that while students have clear preferences for how educational content is delivered to them, these preferences did not predict how well they performed, even when the material was not delivered in their preferred style.

The history of VARK

According to The Encyclopedia of the Mind, theories on learning styles first began to emerge in the 1950s. Since then, five major learning style frameworks have been developed:

  • Visual/Auditory/Reading/Kinesthetic
  • Converging/Diverging
  • Serialist/Holist
  • Verbalize/Visualize
  • Field Dependent/Field Independent

Visual/Auditory/Reading/Kinesthetic (VARK) has arguably become the most prominent learning style framework. Khazan explains that VARK was devised in the early 1990s by school inspector Neil Fleming in response to his observations that different classrooms had different educational outcomes based on how the teacher presented the material.

Fleming went on to describe the four major learning styles that would come to encompass the VARK framework. In his 1995 article “I’m different; not dumb: Modes of presentation (V.A.R.K.) in the tertiary classroom,” Fleming sketches out the basic principles behind the VARK approach, claiming that certain students were “advantaged or disadvantaged” by the course materials instructors selected. Fleming goes on to define the four major learning styles based on anecdotal observations of students. Technical communicators will likely be interested in Fleming’s analysis of how different audiences respond to different material. The problem with Fleming’s article, however, is that he provides no hard data demonstrating how learning outcomes differ based on the type of material presented.

In the face of these questions about VARK’s legitimacy as a scientifically grounded methodology, the official VARK website is surprisingly defensive. A page titled “Using VARK in research” argues that, “Any hypothesis that attempts to find links, especially correlation significance, between VARK and academic success will be invalid and a waste of research time and money. Academic success, is, of course, poorly defined…” Without any verifiable data to confirm or merely imply that using VARK in the classroom truly leads to better learning outcomes, why should technical communicators even bother trying to develop content that appeals to the four VARK categories?

More evidence against VARK

Many research studies conducted over the past few years have further diminished VARK as a legitimate educational framework. A 2017 study by Martha Carr and Donggun An concluded that theories about learning styles such as VARK were not scientifically sound. Though learning style theories classify students’ educational preferences, they fail to build a solid empirical framework capable of explaining why and how students respond to those preferences. Additionally, Carr and An argue that most learning style theories provide no reliable or consistent methods of gauging student success.

Philip Newton provides a critique similar to Carr and An’s. Newton believes that assigning students a fixed learning style is akin to confirmation bias, because once an instructor has decided that a student is a visual or auditory learner, the instructor simply can’t see that student learning in any other style. This is obviously a detrimental way of approaching education, as it vastly reduces the number of approaches the instructor can use while he or she is teaching. Newton also claims that this confirmation bias can be harmful to students as well, because students who have been labelled as auditory learners may never attempt to branch out into learning through other styles.

Approaching a mixed-style framework

It seems that VARK learning styles do not have any convincing empirical evidence to support their claims. With this in mind, how should technical communicators approach the design of educational content for platforms such as eLearning, online FAQs, or conventional procedural documentation? One possible approach is to offer a mix of learning styles whenever possible. In a study of undergraduate nursing students, Sandra Fleming and her colleagues discovered that 53% of the nursing students polled had no preferred learning style, while 35% claimed to have a “dual learning” style. Additionally, Fleming et al. argued that though students might have a favored learning style, they do not necessarily learn best using that style. From these conclusions, it is clear that any content technical communicators design must address a wide variety of learning styles. Because digital platforms offer a wealth of learning options, we should not pigeonhole our audience into any one learning style.

While there may be concrete patterns in how certain types of students learn, there is currently no hard scientific evidence establishing what these learning styles are. It is also hard to tell whether focusing on a student’s perceived learning style truly improves educational outcomes. Because VARK lacks this evidence, technical communicators cannot rely on its framework to design our content. We also cannot completely dismiss learning style theories, as they offer a useful starting point for thinking about our multimedia content; however, we need to remain skeptical and never force our audience into one learning style.

About the author

Noah Page is a technical writing student at Seneca College of Applied Arts and Technology in Toronto, Canada, and holds an MA in creative writing from the University of New Brunswick. Page has work published or forthcoming in Plenitude, Viator, Five2One, UNB’s Journal of Student Writing, filling Station, Existere, and The City Series: Fredericton chapbook. Page also reads submissions for The Fiddlehead literary magazine.

You Had Me at Hello: Communicating Information in Government Proposals

By David Dick, STC Fellow and CPTC Certificant

After thirty-five years employed as a technical writer, I accepted an opportunity to work as a proposal writer. It seemed a logical choice for me because a proposal writer and a technical writer are somewhat alike. They start with the same subject matter to cover, but that is where the similarity ends. A technical writer produces a clear and accurate description of a product or service. A proposal writer produces a rationale, framed from the customer’s perspective, for why a company’s product or service is the best choice of all the rest.

What is a Request for Proposal?

According to http://www.usa.gov, each year the U.S. federal government awards hundreds of billions of dollars in contracts to businesses to meet the needs of federal agencies and the military, which makes the U.S. federal government the largest employer and consumer of technology and services in the United States. In the Washington D.C. metro area, hundreds of companies compete for those government contracts, which is why the federal government is selective about whom it awards contracts to.

The U.S. federal government issues a request for proposal whenever a government agency needs a product or service, such as replacing legacy hardware, managing a data center, migrating applications to the cloud, or providing help desk services. The request for proposal includes instructions on how to prepare the proposal for compliance, requirements for the product or service, criteria for evaluation, and the submission date. The request for proposal has four parts: (1) an introduction to the enterprise and the business problem, (2) technical and business questions, (3) vendor and pricing information, and (4) the schedule and process for selection. The goal of the offeror (i.e., the company replying to the request for proposal) is to deliver a proposal (a response) that answers the technical and business questions, represents low risk, and presents a cost-effective solution.

What is a Proposal?

A proposal is a sales document written in response to a request for proposal. A proposal positions what an offeror has as a solution to a business problem and helps to justify a price, even if it is slightly higher than a competitor’s, by showing that the offeror can provide superior value. A proposal allows the federal government to accomplish the following:

  • Compare vendors, offers, or prices in order to make an informed decision
  • Clarify complex information
  • Make the buying process more objective
  • Slow down the sales process
  • Solicit creative ideas, become educated, or get free consulting

The proposal is structured according to the requirements of the request for proposal to make it easier for evaluators to find answers to the questions it lists.

Characteristics of an Effective Persuasive Proposal

Answer the questions in the request for proposal.

Some companies pursue every request for proposal the federal government releases in order to win work, which creates pressure to respond to requests for proposals as quickly as possible. Their impulse when responding to questions in a request for proposal is to regurgitate facts that “sort of” answer the questions (i.e., bluffing), to re-use a proposal that is somewhat similar to the requirements in the request for proposal, or to use boilerplate content. If the proposal team does not have the time to deliver a customer-specific and compliant proposal, it should not pursue the bid because it is a waste of time and money, and the company is not likely to win the contract. Unfortunately for such companies, proposal evaluators recognize a cut-and-paste effort when they see it.

Choose a persuasive approach to answering questions

Tom Sant, author of “Persuasive Business Proposals: Writing to Win More Customers, Clients, and Contracts,” suggests a four-step approach to writing an effective persuasive proposal:

  • The customer’s needs
  • The desired outcome
  • The solution
  • The evidence

Step One: the customer’s needs. The initial step of writing a persuasive response is to restate your understanding of the customer’s needs, issues, or problems. Summarize the customer’s business situation by focusing on the gaps to close, the competency to acquire, and the problem to solve. By focusing on the customer’s issues (called “pain points”), you communicate an understanding of the customer’s situation. As consumers, we are easily frustrated when a salesperson pretends to understand what we want and chooses to focus on selling us something we do not need in order to make a sale. In government procurement, such a sales approach does not instill confidence in the buyer.

Step Two: the desired outcome. The desired outcome is to motivate the customer to consider the offeror’s product because it meets the requirements, has features that yield strong benefits, and is priced below the competition. As consumers, we always want a good product that meets our needs, offers “nice to have” features, and is within our budget. Government agencies have budgets and must procure a variety of products and services without spending more than the budget allows.

Step Three: the solution. The heart of every proposal is the solution. If the proposal has maintained the reader’s attention, the reader wants to know the solution. Proposal writers focus on the solution and avoid the technical details, jargon, and buzzwords. They provide just enough detail to communicate a well-made, reliable product that is easy to implement, integrate, and use. Proposal writers use graphics to illustrate processes and procedures, and tables to compare benefits and features.

Step Four: the evidence. Every request for proposal has a requirement for past performance reports from two or three other government agencies as proof of competency and experience. Unsubstantiated claims detract from the substance of a proposal, so proposal writers use proof points to support claims. The following are examples of proof points:

  • If the offeror sold the product to other government agencies — proposal writers list them.
  • If those government agencies were satisfied or overjoyed with the product — proposal writers include testimonials.
  • If the offeror’s product won awards — proposal writers list them.
  • If the offeror’s product was reviewed in trade magazines — proposal writers list them.

Providing evidence in a proposal contributes to a compelling reason to choose the offeror’s product over the competition.

Who Evaluates Proposals?

The government agency that issues the request for proposal relies on a source selection team to evaluate proposals. The source selection team consists of individuals who were involved in writing the request for proposal, subject matter experts, and those who will manage the contract when it is awarded. The team comprises four groups:

    1. The source selection authority, which makes the final decision based on best value.
    2. The source selection authority council, which provides functional area expertise to the source selection authority.
    3. The source selection evaluation board, which scores the proposal against the evaluation criteria outlined in the request for proposal. It is highly likely that members of the source selection evaluation board are the authors of the request for proposal, are technically savvy, and know what they are looking for.
    4. The procuring contract officer, who is the primary business advisor and principal source of guidance for the entire source selection.

According to proposal writing expert Tom Sant, the average proposal decision takes six minutes, or 360 seconds. Six minutes is not a lot of time to make a decision about a proposal, but it is enough time for the source selection evaluation board to judge its merits. If the source selection evaluation board likes the proposal, it is set aside for further evaluation.

Proposals are Graded for Compliance

Due to time constraints in awarding contracts, the source selection evaluation board does not read proposals cover to cover. They skim proposals, searching for key words and responses to requirements. They then grade proposals for compliance with the request for proposal, advantages and strengths, and lowest risk. The proposal with the best score and best price wins the contract. The following highlights what the source selection evaluation board looks for in a proposal:

  • A strength-based, innovative solution. There is a direct correlation between high technical ratings, where the source selection evaluation board acknowledges that an offeror provides innovation, and viable solutions that exceed requirements. The greater the number of strengths identified in the proposal, the easier it is for the source selection evaluation board to justify best value.

  • Extensive experience and strong program understanding. Technical solutions with high customer ratings receive strength comments pertaining to customer-focused solutions, evidence of capability to execute the contract, and strong examples and success stories that resonated with the source selection evaluation board.
  • Demonstrated ability. When the source selection evaluation board identifies a management or staffing solution as a strength, the corresponding comments will note a succinct approach with proof points to demonstrate success and examples that resonated with the source selection evaluation board.
  • Qualified key personnel. When the source selection evaluation board favorably rates a staffing model, the corresponding comments identify key personnel that exceed qualifications, highlight demonstrated ability using the proposed solution(s) to achieve program success, and provide specific examples and metrics that resonate with the source selection evaluation board.

The following is a sample of the rating criteria the source selection evaluation board uses to grade proposals.

Superior: The proposal meets all solicitation requirements, demonstrates a good understanding of the requirements, and has features that offer some advantage to the government. Advantages and strengths generally outweigh any disadvantages and weaknesses. Good probability of success with a very low degree of risk of unsuccessful performance.

Acceptable: The proposal meets basic solicitation requirements and demonstrates an adequate understanding of the requirements but does not offer significant advantages to the government over basic request for proposal requirements. Disadvantages and weaknesses are not significant unless significant advantages outweigh significant disadvantages. Where there are areas of concern, clarifications given by the offeror were acceptable. Reasonable probability of success with a low degree of risk of unsuccessful performance.

Marginal: The proposal does not clearly meet all requirements, nor does it demonstrate an adequate approach and understanding of the requirements. The proposal has one or more weaknesses that may require correction. Some areas of concern may not have been fully addressed by the offeror, leaving some ambiguities. Risk of unsuccessful performance is moderate.

Unacceptable: The proposal does not meet requirements and contains one or more significant deficiencies. Risk of unsuccessful performance is high. The proposal is not awardable without being rewritten.

Final Thoughts

A poorly written, company-focused, non-compliant, and unresponsive proposal is easy to write; however, it will not bring in business or contribute to an offeror’s reputation.

Writing a customer-focused, compliant, and responsive proposal is a skill. Every piece of information in the proposal must be meaningful and support the rationale for selection. A well-written proposal communicates what the evaluator wants and needs to know, and it communicates the benefits your solution offers over the competition. It makes it easy for evaluators to compare the proposal to their request for proposal requirements. It has them at “hello.”

About the Author

David is a Proposal Writer at General Dynamics Information Technology. He is an STC Fellow, a CPTC Certificant, and a member of the Association of Proposal Management Professionals (APMP), and is studying for his APMP Foundation Certification.

2018 IDL SIG Demographic Survey – Taking the Pulse of Membership

By Jamye Sagan

In December 2018 we conducted our biennial demographic survey. Out of 567 IDL SIG members, 60 completed the survey – about 10.6% of our membership base. These survey results will help our SIG leadership team continue to improve the services we provide to our members. In the next few weeks, we will publish the final survey results, along with in-depth analysis, on the IDL SIG website.

If you took the survey and indicated that you had an interest in volunteering, you will hear from a member of our leadership team in the next few weeks. We appreciate your willingness to consider offering your services to our SIG! Even if you indicated “no” to volunteering right now, you can still do so in the future. Please visit http://www.stcidlsig.org/about-idl-sig/volunteer-opportunities/ for a partial list of opportunities.

Raffle winners!

Congratulations to raffle winners Rachel Konkle and Rick Morris! Rachel and Rick each will receive a $50 Amazon gift card via email. We will send their prizes in the next several days.

Finally, we thank all survey respondents for taking the time to provide feedback to us!