Are VARK and Other Learning Styles a Legitimate Pedagogical Theory?

By: Noah Page

The growing case against learning styles

As technical communicators, we are responsible for understanding our audience to the very best of our ability so that we can provide the most effective and accessible documentation possible. While rich media content grants us the ability to adapt our content according to our audience’s preferred method of learning, is it really best practice to categorize our audience into supposed learning styles? With this question in mind, technical communicators certainly have a stake in the evolving debate about learning styles.

While certain educational institutions place a high value on accommodating a variety of learning styles, many publications in recent years have begun to question the legitimacy of fixed learning styles. Writing for The Atlantic, Olga Khazan reported that roughly 90% of teachers in a number of countries around the world believe in learning style theories. However, Khazan also detailed a growing body of evidence suggesting that these theories are not scientifically sound. In Scientific American, Cindi May reviewed recent studies about learning styles. May concluded that while students have clear preferences for how educational content is delivered to them, those preferences do not predict performance: students learned just as well even when material was not delivered in their preferred style.

The history of VARK

According to The Encyclopedia of the Mind, theories on learning styles first began to emerge in the 1950s. Since then, five major learning style frameworks have been developed:

  • Visual/Auditory/Reading/Kinesthetic
  • Converging/Diverging
  • Serialist/Holist
  • Verbalizer/Visualizer
  • Field Dependent/Field Independent

Visual/Auditory/Reading/Kinesthetic (VARK) has arguably become the most prominent learning style framework. Khazan explains that VARK was devised in the early 1990s by school inspector Neil Fleming in response to his observations that different classrooms had different educational outcomes based on how the teacher presented the material.

Fleming went on to describe the four major learning styles that would come to encompass the VARK framework. In his 1995 article “I’m different; not dumb: Modes of presentation (V.A.R.K.) in the tertiary classroom,” Fleming sketches out the basic principles behind the VARK approach, claiming that certain students were “advantaged or disadvantaged” by the course materials their instructors selected. Fleming goes on to define the four major learning styles based on anecdotal observations he made of students. Technical communicators will likely be interested in Fleming’s analysis of how different audiences respond to different material. The problem with Fleming’s article, however, is that he provides no hard data demonstrating how learning outcomes differ based on the type of material presented.

In the face of these questions about VARK’s legitimacy as a scientifically grounded methodology, the official VARK website is surprisingly defensive. A page titled “Using VARK in research” argues that, “Any hypothesis that attempts to find links, especially correlation significance, between VARK and academic success will be invalid and a waste of research time and money. Academic success, is, of course, poorly defined…” Without any verifiable data to confirm or merely imply that using VARK in the classroom truly leads to better learning outcomes, why should technical communicators even bother trying to develop content that appeals to the four VARK categories?

More evidence against VARK

Many research studies conducted over the past few years have further diminished VARK as a legitimate educational framework. A 2017 study by Martha Carr and Donggun An concluded that learning style theories such as VARK are not scientifically sound. Though these theories classify students’ educational preferences, they fail to build a solid empirical framework capable of explaining why and how students respond to those preferences. Additionally, Carr and An argue that most learning style theories provide no reliable or consistent methods of gauging student success.

Philip Newton provides a critique similar to Carr and An’s. Newton argues that assigning students a fixed learning style invites confirmation bias: once an instructor has decided that a student is a visual or auditory learner, the instructor stops seeing that student learning in any other style. This approach is detrimental because it vastly reduces the number of teaching strategies the instructor will consider. Newton also claims that this confirmation bias can harm students, because those who have been labelled auditory learners may never attempt to branch out into learning through other styles.

Approaching a mixed-style framework

It seems that VARK learning styles do not have any convincing empirical evidence to support their claims. With this in mind, how should technical communicators approach the design of educational content for platforms such as eLearning, online FAQs, or conventional procedural documentation? One possible approach is to offer a mixed learning style approach whenever possible. In a study of undergraduate nursing students, Sandra Fleming and her colleagues discovered that 53% of the nursing students polled had no preferred learning style, while 35% claimed a “dual learning” style. Additionally, Fleming et al. argued that though students might have a favored learning style, they do not necessarily learn best using that style. From these conclusions, it is clear that any content technical communicators design must address a wide variety of learning styles. Because digital platforms offer a wealth of learning options, we should not pigeonhole our audience into any one learning style.

While there may be concrete patterns in how certain types of students learn, there is currently no hard scientific evidence establishing what those learning styles are. It is also hard to tell whether focusing on a student’s perceived learning style truly improves educational outcomes. Because VARK lacks this evidence, technical communicators cannot rely on its framework when designing content. Nor can we completely dismiss learning theories, as they offer a useful starting point for thinking about our multimedia content; however, we need to remain skeptical and never force our audience into one learning style.

About the author

Noah Page is a technical writing student at Seneca College of Applied Arts and Technology in Toronto, Canada, and holds an MA in creative writing from the University of New Brunswick. Page has work published or forthcoming in Plenitude, Viator, Five2One, UNB’s Journal of Student Writing, filling Station, Existere, and The City Series: Fredericton chapbook. Page also reads submissions for The Fiddlehead literary magazine.

You Had Me at Hello: Communicating Information in Government Proposals

By David Dick, STC Fellow and CPTC Certificant

After thirty-five years employed as a technical writer, I accepted an opportunity to work as a proposal writer. It seemed a logical choice because a proposal writer and a technical writer are somewhat alike. They start with the same subject matter, but that is where the similarity ends. A technical writer produces a clear and accurate description of a product or service. A proposal writer produces a rationale, from the customer’s perspective, for why a company’s product or service is the best choice of all.

What is a Request for Proposal?

According to http://www.usa.gov, the U.S. federal government awards hundreds of billions of dollars in contracts each year to businesses that meet the needs of federal agencies and the military, which makes the federal government the largest employer and consumer of technology and services in the United States. In the Washington, D.C. metro area, hundreds of companies compete for those contracts, which is why the federal government is selective about the companies to which it awards them.

The U.S. federal government issues a request for proposal whenever a government agency needs a product or service such as replacing legacy hardware, managing a data center, migrating applications to a cloud, or providing help desk services. The request for proposal includes instructions on how to prepare the proposal for compliance, requirements for the product or service, criteria for evaluation, and the submission date. The request for proposal has four parts: (1) an introduction to the enterprise and the business problem, (2) technical and business questions, (3) vendor and pricing information, and (4) the schedule and process for selection. The goal of the offeror (i.e., the company replying to the request for proposal) is to deliver a proposal (a response) that answers the technical and business questions, represents low risk, and presents a cost-effective solution.

What is a Proposal?

A proposal is a sales document written in response to a request for proposal. A proposal positions what an offeror has as a solution to a business problem and helps to justify a price, even one slightly higher than a competitor’s, by showing that the offeror can provide superior value. A proposal allows the federal government to accomplish the following:

  • Compare vendors, offers, or prices in order to make an informed decision
  • Clarify complex information
  • Make the buying process more objective
  • Slow down the sales process
  • Solicit creative ideas, become educated, or get free consulting
The proposal is structured according to the requirements of the request for proposal to make it easier for evaluators to find answers to the questions listed in the request for proposal.

Characteristics of an Effective Persuasive Proposal

Answer the questions in the request for proposal.

Some companies pursue every request for proposal the federal government releases in order to win work, which creates pressure to respond as quickly as possible. The impulse when responding under pressure is to regurgitate facts that “sort of” answer the questions (i.e., bluffing), to re-use a proposal written for somewhat similar requirements, or to fall back on boilerplate content. If the proposal team does not have the time to deliver a customer-specific, compliant proposal, it should not pursue the bid: the effort wastes time and money, and the company is unlikely to win the contract. Proposal evaluators recognize a cut-and-paste effort when they see it.

Choose a persuasive approach to answering questions

Tom Sant, author of “Persuasive Business Proposals: Writing to Win More Customers, Clients, and Contracts,” suggests a four-step approach to an effective persuasive proposal:

  • The customer’s needs
  • The desired outcome
  • The solution
  • The evidence

Step One: the customer’s needs. The initial step of writing a persuasive response is to restate your understanding of the customer’s needs, issues, or problems. Summarize the customer’s business situation by focusing on the gaps to close, the competency to acquire, and the problem to solve. By focusing on the customer’s issues (called “pain points”), you communicate an understanding of the customer’s situation. As consumers, we are easily frustrated when a salesperson pretends to understand what we want but focuses on selling us something we do not need in order to make a sale. In government procurement, such a sales approach does not instill confidence in the buyer.

Step Two: the desired outcome. The desired outcome is to motivate the customer to consider the offeror’s product because it meets the requirements, has features that yield strong benefits, and is priced below the competition. As consumers, we always want a good product that meets our needs, offers “nice to have” features, and is within our budget. Government agencies have budgets and must procure a variety of products and services without spending more than the budget allows.

Step Three: the solution. The heart of every proposal is the solution. If the proposal has maintained the reader’s attention, the reader wants to know the solution. Proposal writers focus on the solution and avoid technical details, jargon, and buzzwords. They provide just enough detail to communicate a well-made, reliable product that is easy to implement, integrate, and use. Proposal writers use graphics to illustrate processes and procedures, and tables to compare benefits and features.

Step Four: the evidence. Every request for proposal requires past performance reports from two or three other government agencies as proof of competency and experience. Unsubstantiated claims detract from the substance of a proposal, so proposal writers use proof points to support claims. The following are examples of proof points:

  • If the offeror sold the product to other government agencies, proposal writers list them.
  • If those government agencies were satisfied or overjoyed with the product, proposal writers include testimonials.
  • If the offeror’s product won awards, proposal writers list them.
  • If the offeror’s product was reviewed in trade magazines, proposal writers list them.

Providing evidence in a proposal contributes to a compelling reason to choose the offeror’s product over the competition.

Who Evaluates Proposals?

The government agency that issues the request for proposal relies on a source selection team to evaluate proposals. The team consists of individuals who helped write the request for proposal, subject matter experts, and those who will manage the contract once it is awarded. The source selection team comprises four groups:

    1. The source selection authority, which makes the final decision based on best value.
    2. The source selection authority council, which provides functional area expertise to the source selection authority.
    3. The source selection evaluation board, which scores the proposal against the evaluation criteria outlined in the request for proposal. It is highly likely that members of the source selection evaluation board are the authors of the request for proposal, are technically savvy, and know what they are looking for.
    4. The procuring contract officer, who is the primary business advisor and principal source of guidance for the entire source selection.

According to proposal writing expert Tom Sant, the average proposal decision takes six minutes. That is not much time, but it is enough for the source selection evaluation board to judge whether a proposal merits further consideration. If the board likes the proposal, it is set aside for deeper evaluation.

Proposals are Graded for Compliance

Because of time constraints in awarding contracts, the source selection evaluation board does not read proposals cover to cover. Members skim proposals for key words and responses to requirements, then grade proposals for compliance with the request for proposal, advantages and strengths, and lowest risk. The proposal with the best score and best price wins the contract. The following highlights what the source selection evaluation board looks for in a proposal:

  • A strength-based, innovative solution. There is a direct correlation between high technical ratings, in which the source selection evaluation board acknowledges that an offeror provides innovation, and viable solutions that exceed requirements. The greater the number of strengths identified in the proposal, the easier it is for the board to justify best value.

  • Extensive experience and strong program understanding. Technical solutions with high customer ratings receive strength comments pertaining to customer-focused solutions, evidence of capability to execute the contract, and strong examples and success stories that resonated with the source selection evaluation board.
  • Demonstrated ability. When the source selection evaluation board identifies a management or staffing solution as a strength, the corresponding comments will note a succinct approach with proof points to demonstrate success and examples that resonated with the source selection evaluation board.
  • Qualified key personnel. When the source selection evaluation board favorably rates a staffing model, the corresponding comments identify key personnel that exceed qualifications, highlight demonstrated ability using the proposed solution(s) to achieve program success, and provide specific examples and metrics that resonate with the source selection evaluation board.

The following is a sample of the rating criteria the source selection evaluation board uses to grade proposals.

Superior: The proposal meets all solicitation requirements, demonstrates a good understanding of the requirements, and has features that offer some advantage to the government. Advantages and strengths generally outweigh any disadvantages and weaknesses. Good probability of success with a very low degree of risk of unsuccessful performance.

Acceptable: The proposal meets basic solicitation requirements and demonstrates an adequate understanding of the requirements, but does not offer significant advantages to the government over basic request for proposal requirements. Disadvantages and weaknesses are not significant, or significant advantages outweigh them. Where there were areas of concern, clarifications given by the offeror were acceptable. Reasonable probability of success with a low degree of risk of unsuccessful performance.

Marginal: The proposal does not clearly meet all requirements, nor does it demonstrate an adequate approach to and understanding of the requirements. The proposal has one or more weaknesses that may require correction. Some areas of concern may not have been fully addressed by the offeror, leaving some ambiguities. Risk of unsuccessful performance is moderate.

Unacceptable: The proposal does not meet requirements and contains one or more significant deficiencies. Risk of unsuccessful performance is high. The proposal is not awardable without being rewritten.

Final Thoughts

A poorly written, company-focused, non-compliant, and unresponsive proposal is easy to produce; however, it will not bring in business or enhance an offeror’s reputation.

Writing a customer-focused, compliant, and responsive proposal is a skill. Every piece of information in the proposal must be meaningful and support the rationale for selection. A well-written proposal communicates what the evaluator wants and needs to know, and communicates the benefits your solution offers over the competition. It makes it easy for evaluators to compare the proposal to their request for proposal requirements. It has them at “hello.”

About the Author

David is a Proposal Writer at General Dynamics Information Technology. He is an STC Fellow, CPTC Certificant, member of the Association of Proposal Management Professionals (APMP) and is studying for his APMP Foundation Certification.

2018 IDL SIG Demographic Survey – Taking the Pulse of Membership

By Jamye Sagan

In December 2018 we conducted our biennial demographic survey. Of our 567 IDL SIG members, 60 completed the survey, about 10.6% of our membership base. These survey results will help our SIG leadership team continue to improve the services we provide to our members. In the next few weeks, we will publish the final survey results, along with in-depth analysis, on the IDL SIG website.

If you took the survey and indicated that you had an interest in volunteering, you will hear from a member of our leadership team in the next few weeks. We appreciate your willingness to consider offering your services to our SIG! Even if you indicated “no” to volunteering right now, you can still do so in the future. Please visit http://www.stcidlsig.org/about-idl-sig/volunteer-opportunities/ for a partial list of opportunities.

Raffle winners!

Congratulations to raffle winners Rachel Konkle and Rick Morris! Rachel and Rick each will receive a $50 Amazon gift card via email. We will send their prizes in the next several days.

Finally, we thank all survey respondents for taking the time to provide feedback to us!

From the editor

By: Kelly Smith

I’m writing this about a week before Christmas, but there is no snow on the ground here in Michigan. Still, I am surrounded by the sights and sounds of the holidays, from wrapping paper and labels, to candy, nuts, and way too many cookies.

So, here! Grab a cookie and some cocoa and read on to see what our SIG has been up to these past three months and what we have in store for 2019. A lot of changes are headed our way!

In this issue

Wearing her program manager hat, Viqui Dill provides an update on recent webinars. And remember – as a SIG member, you get free admission to all our webinars! That’s a fantastic perk of being a SIG member.

InterChange 2018 happened in October in Lowell, Massachusetts, and Viqui Dill gives us a great overview of the speakers. It sounds like a wonderful conference!

Viqui Dill tells us all about the Sixth Annual Virtual Open House. You can read the article, or watch the video where members introduced themselves and discussed why the SIG is such a valuable resource. (Also, there are cats. I’m pretty sure the internet requires you to include at least one cat per newsletter.)

Guest writer Daniel Maddox explains How to Create Objectives for instructional design. Maddox is a former classmate of mine who recently graduated with his Master’s in Technical Communication Management from Mercer University School of Engineering.

Co-manager Viqui Dill welcomes new volunteers, thanks those who have changed positions, and lists the open positions we still need to fill. Please consider joining the leadership of your IDL SIG! Viqui will be transitioning from co-manager to programs manager.

Speaking of volunteering, Lori Meyer announces a new SIG awards program that will award two volunteers for their new or ongoing contributions to our SIG. Check out her article to see how to nominate someone.

Co-manager Lori Meyer bids farewell to that role and thanks her fellow leaders for their assistance. But she’s not leaving us! I found out today that Lori plans to step into the membership manager role, so she will still be part of our SIG’s leadership team and we welcome her expertise.

In the secretary’s column, Marcia Shannon says her goodbyes as our secretary and tells us why that is the perfect role for a new volunteer. Could that person be you? Read her article to see how you can help contribute to your SIG. Marcia is not going anywhere either. When her time as secretary ends, she will be stepping into the co-manager’s role.

Student outreach coordinator, Sylvia Miller, reminds us that the SIG maintains a list of degree programs in instructional design. Want to increase your skillset? Need to find a class or program? Check out Sylvia’s article for information.

I hope you enjoy this edition of IDeaL. If you would like to write an article or book review for us, please contact me at newsletter@stcidlsig.org and check out our publication policy.

I hope you have a happy and relaxing holiday season. See you next year!

List of Degree or Certificate Programs in Instructional Design

By Sylvia Miller

Are you or a colleague contemplating a degree or certificate program in instructional design? If so, you should check out our webpage that provides a long list of such programs. Each program is linked directly to the institution’s website which contains details about the program. You’ll find a variety of titles for these programs, including educational technology, instructional technology, instructional systems technology, instructional design, and more. You’ll also find that some institutions offer online-only programs, while others provide only in-person degree or certificate programs, or a mixture of the two.

Sure, you could just search the web for “instructional design,” but your search results will include descriptions of universities that offer instructional design services for professors and staff who do not develop or maintain their own online courses. So, we hope you’ll check out our Education page at http://www.stcidlsig.org/education/. And if you know of an institution that should be added to the list, please email me at sylviaamiller@woh.rr.com with the name of the institution and, if possible, a link to the webpage describing the program. Also notify me if you find a broken link. Meanwhile, enjoy!