We don’t need no Stinkin’ Knowledge Test..

March 3, 2010 by Bruce Landsberg

Mea culpa from last week: Before getting to this week's topic, I owe you an apology for not checking Google Earth last week to see that Palo Alto Airport was NOT in the midst of a housing development; this relates to the Cessna 310 that crashed into a house during an IMC takeoff. Any fool can see that the airport is in a relatively open area. I appreciate the correction!

Now – Do we need a knowledge test? There has been discussion in various groups about what purpose the FAA's knowledge test serves. That conversation has been running longer than MASH and Seinfeld reruns combined. It came to light again when the only two testing companies, CATS and Laser Grade, decided to arbitrarily raise their testing fees by $50 to comply with some new FAA requirements. AOPA appropriately objected, and we'll see where that goes.

Your thoughts would be welcome, however, on the broader topic beyond the economics involved. The written test used to be just that: a multiple-choice form filled out with a number 2 pencil, plus a separate test booklet with charts appended. Along came the computer, and the test had to be renamed and re-engineered for electronic delivery.

I've written in the magazine (and engaged with some of the FAA staff) that some of the questions were designed more to play gotcha with pilots than to accurately assess their understanding of a particular concept. The interpolation of time, fuel consumption, and takeoff distances is one area where the degree of precision demanded by the test overreaches the operational realities.
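
By way of illustration only, here is a minimal sketch in Python, using made-up performance numbers rather than figures from any actual POH or FAA question, of the to-the-foot interpolation the test expects alongside the conservative rounding most pilots actually use:

    def interpolate(x, x0, x1, y0, y1):
        """Linear interpolation of y at x between table points (x0, y0) and (x1, y1)."""
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

    # Made-up table entries: takeoff ground roll at 20 C and 30 C, sea level.
    roll_20c = 925  # feet
    roll_30c = 995  # feet

    # Test-style answer: interpolate to the foot for a 25 C day.
    exact = interpolate(25, 20, 30, roll_20c, roll_30c)
    print(f"Interpolated ground roll at 25 C: {exact:.0f} ft")  # 960 ft

    # Operational answer: take the higher tabulated value and add a 50% margin.
    conservative = roll_30c * 1.5
    print(f"Conservative planning figure: {conservative:.0f} ft")  # 1492 ft

The few feet of precision the test rewards simply disappear into the margin any prudent pilot adds anyway, which is the heart of the complaint.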

But do we need a test at all? Some say that the testing could be handled by designated pilot examiners (DPEs) when the applicant comes in for the check ride. I'm not so sure. The population of DPEs is not entirely standardized, so the "Executioner" could give a six-hour oral while the "Santa Claus" might be happy after 20 minutes.

The alternative view is that there is a place for knowledge testing, and we might acknowledge that it will not and cannot assess all the areas at all the depth a pilot should know. Do pilots and the test prep industry use rote learning and memorization? Yup, and that's no different than most other introductory learning processes.

However, to pass, learning must take place, and standardization is essential to deal with the number of applicants. Knowing the basic FARs, airspace, and charting is something the test does well. Good decision making and the nuances of weather are much harder to assess, and we should be realistic in our expectations. You probably won't get a real view of someone's ADM abilities until they're in the real world and think no one is watching.

So, what are your thoughts? ASF will be having this discussion with the FAA in some detail later this spring.

Bruce Landsberg
Senior Safety Advisor, Air Safety Institute

ASI Online Safety Courses  |  ASI Safety Quiz  |  Support the AOPA Foundation

30 Responses to “We don’t need no Stinkin’ Knowledge Test..”

  1. Bruce Ziegler Says:

    My biggest beef with the written test has always been interpolating to a level of precision that most likely exceeds the data being interpolated. Write the question to test the subject matter knowledge, not math skills. As to the cost, with the overhead of dedicating a testing room and equipment, and time for a proctor, perhaps the increase is not out of line. I would be surprised if the cost of the written exam is seen as a hurdle by many serious aviation students.

  2. Bruce Williams Says:

    The tests still serve a purpose that, as Bruce Landsberg noted, isn't easily handled by DPEs. But the tests need an overhaul. As others have noted, there's too much emphasis on interpolating and calculating to silly levels of precision. Questions should emphasize essential regulations and other fundamentals that must be known. Most importantly, areas such as flight planning and navigation should be updated to reflect today's operating environment and tools. For example, even the private pilot test should ask more questions about working with ATC, using DUATS, and the like. IFR tests should cover more GPS-related topics. And the weather sections should take into account new tools available to pilots, not just decoding the most obscure abbreviations and symbols used in reports, forecasts, and charts.

  3. Michael Stoneking Says:

    I just took the written exam for IFR before the fee increase, which is way too big of a price hike. I feel the test does not really find out what you know, either. I studied long and hard, really concentrated on my weak areas, and then took an online practice version from Sporty's 12 times before taking it for real. On the practice ones I always scored well, and I felt really confident going into the real one. I guess my test prep book was a little old (2007), since there were a lot of questions on the real test I had never even seen. It was also a lot more difficult than the practice ones, and I felt lucky to pass! I ran into at least 7 or 8 "gotcha" questions, all with answers so similar I really doubted what I did and did not know! When they only ask 60 out of a bank of over 900, there is no way to get a feel for what you do or do not know. I could have gotten "lucky" and had an easy test and looked like a know-it-all. Oh yeah, about those weak areas I studied so hard for: I only got one question that fit into that group, and a bunch in the Regulations area that, like I say, were very cleverly worded! I agree you do not want a test that can all be easily memorized, but it needs to be a little more standard, with something like 75% of the questions asked in clear-cut, no-nonsense formats; then, if they want to throw in a couple of "gotchas" that will not cause you to fail if you have really studied and prepared, so be it. It also sounds to me like there is some collusion going on, with CATS and LASER GRADE both raising the fee the same amount at the same time!

  4. Joseph Martin Says:

    I think testing is necessary to ensure pilots learn something about flying besides how to control an aircraft.
    I agree with previous commenters that there is too much ‘gotcha’ either in the math or the semantics of some of the questions I have studied. But there is plenty of important stuff a pilot should know that can be tested.
    Some more important areas are:
    - knowing airspace; it's crowded up there, and pilots need the additional awareness of what the operating parameters are,
    - communications; both with ATC (need to understand what they are saying) and proper terminology for non towered airports,
    - right of way; who has precedence over the aircraft you are flying and under what conditions.

  5. Don Arnold Says:

    I, too, studied hard for both flex-wing and fixed-wing Sport Pilot last year. But I found the actual tests too easy. What a change from the Rec Pilot test, which I took years ago. Then, there were a dozen TAF and METAR abbreviation questions. Huh? The whole Rec thing was that you could climb out to pattern altitude and see all the weather you could legally encounter.
    Now the SP tests have too many obvious human factors questions. One test had one airspace question; the other, a whopping two!! A random number generator must be picking the questions. No sentient being would have chosen fewer than 30% airspace questions, especially for SP. We're the FNGs who are expected to stay out of everyone else's way. Not a lot of human factors needed with our small gas tanks and restricted altitude.

  6. Marl Says:

    I don’t think we will ever see the “written” tests disappear, but they do need to be revised to be a true test of knowledge, and not just a rite of passage.
    I think it was Einstein who said "never commit to memory what can be looked up." The knowledge tests should work to that standard: test the student's knowledge of the subject matter, but don't include obscure bits of information that are almost never needed in the real world, or that, if they are, are not critical to safety and can be recalled from a manual or other resource when needed.
    Make the tests more realistic.

  7. Joshua Carroll Says:

    I took my first knowledge test for my PPL in August of 2008, and I am coming up due for another one this August. I'm very disappointed with the fee increase. I understand that increases are necessary with inflation and so forth, but $50 sounds like an awfully big increase to me, plus it seems like it would deter some people from flying. Not this particular fee by itself, but in general, people can't afford to be nickeled and dimed everywhere. Aviation is already an expensive field to enter.

    I also feel that the number of questions on the test, out of the total in the bank, does not represent the amount of knowledge that a pilot should have. I passed the test with flying colors, but got caught on those "gotcha" questions as well. I feel those questions are good in a way, since pilots should pay more attention to everything they do in that regard, but c'mon, is this really what it's coming down to? No, the questions shouldn't be easy, but they should at least reflect more of what we need to know, and to tell you the truth, I think there should be more questions on the test. I hate that I'm saying we need more questions to think about and answer, but it's the truth. Think about it: we're pilots, we know this stuff, we're still learning, we own the sky, and 60 questions doesn't sound like enough to me for the flying that we do.

  8. FIELD MOREY Says:

    As a DPE for forty-plus years, I remember the days when students used to buy an ACME book, memorize the questions, and take a test where 80% of the questions were irrelevant or trivial, filled with pictures of instruments and radios that you never have seen or will see again, etc.

    The only thing that has changed during that time is the use of a computer.

    Leave the knowledge test, but clean it up: get rid of the nonsense questions and make it relevant.

  9. Tom Koff Says:

    The Knowledge Test is part of the process that shouldn't be eliminated. I completely agree that some of the questions should be eliminated and others added. Who needs to know about relative bearings and ADFs in this age of GPS? It would be impossible for DPEs to effectively test the range of knowledge required to fly in this world. There is no reason on earth to dumb down the requirements for a pilot certificate or rating any further. After all, the students have access to all the questions and answers as part of the study process.

    What actually is the objection to having a knowledge test? Does it take too much time to study? Is it unfair to people who are brain dead or in a coma? Is there just too much information for one person to know? I don’t think so. Yes, some of the questions should be changed or eliminated, and the test modernized, but the test doesn’t need to be eliminated.

    On the subject of cost: I hate it when prices increase, especially when someone gives a particularly lame excuse. However, I remember when I received my pilot certificate, and the cost was about $600 for the whole thing. Now it costs over 10 times as much (closer to 15x, really, or $10,000). Should we limit the amount flight instructors can charge? Or, more to the point, how about DPEs? $450 for a checkride seems outrageous. But when you look at it on an hourly basis, it isn't that far out of line.

    There are no easy answers. Learning to fly in our complex environment isn’t a simple process. It requires a broad range of knowledge and the ability to apply it. Eliminating the Knowledge Test isn’t the answer to any appropriate question.

  10. Todd Carmosino Says:

    As others have commented, the knowledge tests are relevant but must be updated. Take the IFR knowledge test, for example. As NextGen is implemented, how can someone's skills be accurately measured with questions about legacy land-based navigation? It's imperative that the knowledge tests keep pace with the significant changes our National Airspace System will experience this decade.

  11. james denike Says:

    I believe knowledge tests will continue to serve a good purpose, but they can neglect to spot critical areas of deficiency. If a student pilot gets a 92 on his written but doesn't understand the relationship of wing configuration or g-load to stall speeds, he/she is on his/her way to big trouble ahead.
    Ergo, the new tests need to weight certain items in such a way that critical items like those above are caught immediately and remedial training is given to correct the deficiency ASAP.
    I once had an acquaintance approach me to ask whether the stall speed increased or decreased with flap retraction. Originally a C172 pilot, this same person had just purchased a Corsair fighter from a foreign country. Despite the efforts of my own boss (a Korean War pilot who flew USMC Corsairs), this private pilot got himself killed several months later.
    jjd ATP/CFII
    newtown, CT

  12. Chris Says:

    I agree about some of the math being unnecessary. It may just be to test the prospective pilot’s mathematical reasoning ability. There also seems to be a presumption that the candidate will proceed all the way to an ATP rating.

    Interpolations are usually unnecessary. Our ground instructor told us that for the test, we would need to be able to do interpolations. But we were told that in real life, we should always make sure we were well to the conservative side of the nearest tabulated value. If the nearest tabulated value was close to the point where we were operating, it might even be a good idea to use the next tabulated value beyond that. If you're operating too close to the tabulated value, interpolated or not, don't fly; there isn't enough conservatism. For most things, like takeoff distance, what's the point of interpolating down to a few feet, when you should add at least 50% anyway? Use the next highest tabulated value, and then add 50% as well.

    I've seen questions on the instrument test asking you to compute such things as arrival time, where even for instrument flying, the precision required was ridiculous. For a two-hour flight, the answer options weren't spaced at obvious intervals of something like 5 minutes apart, where it would be possible to rule out a couple of the answers. Instead, a couple of the answers were only 30 seconds apart. This required a tangled mess of numerous complex calculations, such as time to climb to altitude, interpolations, time to traverse a DME arc in the departure, winds aloft, etc. It was also necessary to find information from the POH and other charts (which at least were provided).

    Even for an experienced, competent pilot, this single question alone could easily use up more than a half hour of the examination time. To be fair, an IFR pilot should probably be able to do these things. But this extremely complex question should have been divided into several individual, less time-consuming questions that addressed these subjects separately. In practical use, if winds aloft were off even slightly, the pilot would probably have to file a modified ETA once en route anyway.

    There needs to be more differentiation in testing, between pilots flying purely for enjoyment, and those flying in more demanding commercial situations.

  13. Larry Tarr Says:

    I am definitely of the opinion that the $50 increase in testing fees is unnecessary. Upon hearing of the increase, I quickly scheduled and completed my Instrument written exam on February 28, the last day at the “old” price. In spite of my final score of 95%, several of the questions were worded so vaguely that two of three answers could have been correct. A question regarding taxi signs had a choice that was the most appropriate answer, but I scored wrong on that question. Internet research of the question codes later that day exposed a revision that made another selection the “correct” choice. Although the new wording was more precise, it was not revised on my test but the choice (A, B, or C) was modified to correspond with the new wording!

    There was also a question regarding the entry to use for a holding pattern specified according to ATC instructions. I firmly believe the type of entry I chose was correct but the question was scored as wrong. I am meeting with my CFII tomorrow and plan to ask him which entry he would have picked.

    My point here is that someone whose grade is at the borderline of 70% might fail due to poor/vague wording or a single question that does not score the correct answer. CATS and Laser Grade raised their fees (simultaneously and by the same amount, interestingly enough) but seem to pay no attention to quality control on examinations in which the outcome can dramatically affect the finances and possibly the futures of test takers.

  14. Steve Pruszynski Says:

    Having taken several government-style tests in my lifetime, including the Private Pilot and FCC communications tests, I saw good, bad, incorrect, and then the GOTCHA questions. I think some formal written testing is necessary. The test should be geared to what is necessary for safe flight operations. Some of the required knowledge is common to all flight operations and some is geared to a particular type of operation. You have to have some standardization to make sure the new pilot has been introduced to the regulations and the basic concepts of flying for the type of flying that pilot will be doing. Testing, however, only evaluates your knowledge at that instant in time. It does not indicate the amount of knowledge you will retain or actually understand. I guess the bottom line is that when the —- hits the fan, it will be your practical flying experience that determines the outcome. Flying is one of those things where, once you commit to it, you will never stop learning something new or practicing old skills. I think a testing system that evaluates the knowledge of the pilot at intervals of training is better than sitting down and taking just a single test. What is the real reason for the test: to get a grade, or to train someone to be a safe aviator?

  15. Art Ahrens Says:

    The written test provides a formal way for the DPE to determine the level of knowledge that the pilot has, and he can adjust his oral exam based upon those test scores. As has been stated very well, I have a low tolerance for the GOTCHA questions. And do we still need to be tested on Remote Magnetic Indicators, or other navigational devices that are not even in the airplane anymore? Rumors abound that the FAA is adding questions about LORAN operations!!! (Just kidding, but you get my point!) The FAA needs to focus on general operating, safety, and navigational questions that ensure the pilot understands the material, worded such that the examinee still needs to read the question carefully.

  16. josh morowitz Says:

    Some type of knowledge test is probably a good idea, but it doesn't have to be administered by a third party or the FAA. Let the CFI who is recommending the student for the flight test certify that the student has passed a written knowledge exam. Sure, there is an opportunity to cheat the system, but I imagine that DPEs would quickly catch on if students showed up for their exams without the required background. If a CFI keeps recommending students who can't pass the oral part of their exam, then he/she should be called to task for it.

  17. Bruce Liddel Says:

    I share Larry Tarr's frustration with CATS grading. Apparently marking all the correct responses no longer guarantees better than about a 96% score. In my own case (commercial), using the categories the test results indicated, plus my recall of the questions that had been asked, I went back and verified that on those two questions I had indeed marked the answers that Gleim had recommended, and further verified with Gleim that their recommendations are still the same. Apparently, pilots are held to a much higher standard (clairvoyance, no less!) than the testing agencies. I'd like to see a return to the paper exams, so there can be evidence of any foul play, and a mechanism to contest bad questions. And yes, the fee is way too high for the level of incompetence I encountered. I really like Josh's idea.

    Still, I'd be hard pressed to come up with an alternative to the written exam, despite its many fixable flaws. We are required to learn a lot of material, and paying someone by the hour to verify that we have no gaping holes would be way more expensive, and less standardized, than the written exams are.

  18. Glenn Sproat CFII Says:

    The knowledge test is definitely required! It simplifies the final exam, hopefully, to a few hours rather than all day. It makes the job of the DPE easier and also supplies the student with what he does (supposedly) need to know for that certificate. What needs to be simplified is weather! We no longer send weather by telegraphy, so put it in longhand English so you can read it, rather than having to read it and try to guess what the issuing station meant.
    If the student learns by rote and retains it, so be it; but we have stopped most of the Wings programs where the latest and greatest new knowledge can be spread, so us older pilots are not necessarily keeping abreast of what is needed in the airspace system today. The new instruction by weekend ground schools does not teach the test, but makes it easy to see what is wanted by the FAA test standards.
    The FAA occasionally used to sit in on these schools to make sure it was not a read-the-900-questions exercise. Most schools do a very good job of teaching what knowledge is needed for the test, and the and/ors that the FAA throws in to trick the uninformed.

  19. Joseph M. Zias Says:

    I am currently a high school teacher of physics and chemistry, and in another career I was a professional engineer in the electronics industry. I am a student pilot and hope to complete my training early this summer. Knowing how well we "spoon feed" young students, and how resistant they can be to actually studying some difficult material, I suggest I DO NOT want people given a pilot's license who can't pass the written test.

  20. Chuck Thom Says:

    I'm a CFI, and I'm highly in favor of the Knowledge Test, for the best reason: it forces the student to study everything, thoroughly, over a period of time suitable to the target certificate. A well-rounded study program, aimed at learning what is known to be required for the Knowledge Test, is the most important part of the process. The test itself is much less important.

  21. Gary Eaton Says:

    Bruce,

    It appears that nearly everyone agrees that we need the stinkin' knowledge exams.

    My complaint is that the exam for LTA hot air balloons has far too many questions that have no remote relevance to ballooning!

  22. JC Hyde Says:

    I agree, the test needs to be rewritten to become relevant with respect to changes in current aircraft systems and the requirements to operate and navigate within new airspace control structures, such as temporary flight restriction areas and the permanent flight restricted areas. One thing that could help a great deal is for the FAA to provide the answers to the test, as well as where to find the relevant data used to determine each answer. In the Air Force, the aircraft master question file (containing 300-400 questions) also had the answers. If you needed to understand why an answer was the correct answer, you could go to the Dash-1 and read the data. Very helpful in self-study and in preparing for the oral examination.

  23. Jim Says:

    All governmental tests that I have taken opt for the obscure, not the important, information. The tests should, and to a point do, test knowledge that the testee should know to be a safe pilot at the level he or she is testing. Unfortunately, as with most, if not all, of the governmental tests that I have taken, the testee is tested more on his or her ability to second-guess the test writer's intent than on the content of the subject matter. BUT, after 35-plus years of taking governmental tests, and teaching the subject matter, I do not expect any changes any time soon.

  24. Larry Hoffman Says:

    Studying for and passing the knowledge test, to me, gave a good base to build on. As you stated, it would be grueling for the examiner to cover all of this in the oral exam. It also gives you and your instructor areas that can be improved to make you a better pilot before you go for that check ride. We all know that the more that can be embedded early in training, the better.

  25. Rich Says:

    A written test absolutely should be required, and while I don't think all of the questions are always great, it's really not that far off the mark. Every vocation has some combination of rote memorization and actual learning, and this is no different. It's good for students to go through all the material three times: once while learning to fly, once again preparing for the written (near the end of training), and once more right before and during the checkride. Having a standard written test eliminates any instructor bias and helps make sure all the appropriate areas were covered adequately by the instructor, something instructors should appreciate (I do!). It's a third-party assessment of where a student is, and students should appreciate that as well.

    As for the fee increase, yeah, it's a bummer, but it's a drop in the bucket when we talk about learning to fly.

    Keep the test with minor revisions, but don’t throw the baby out with the bathwater.

  26. Stephen Bell Says:

    I believe the knowledge test should be required for certification.

  27. Bradley Spatz Says:

    All tests have their problems and annoyances. And yes, some of those questions are just obstacles to master.

    But the written and practical tests provide a good balance. Preparing for the written, if done in earnest, familiarizes students with the FARs, AIM, and standard texts — key references that will serve the "licensed to learn" pilot well. And this methodology will apply to learning new aircraft and systems, flight planning, etc. I think the written exam serves an important role.

  28. Marty Fluharty Says:

    I took the Sport Pilot Knowledge Test in mid-February at a Laser Grade site. It was irritating that two of the questions were based on four-seat airplanes; a light sport aircraft has only two seats. Gotcha questions should be removed; there were two. The quality of the charts was so poor that they were not readable. Thank goodness there was a graphs and charts book available.

    My score was over 90. I had studied for months, my instructor had grilled me each time I flew, and I had practiced with the Gleim computer tests. I learned a lot, and some of it I know I'll not use, but it's important. Having to study in order to do well on the test ensured that I learned, and I think I'll be more ready for my checkride oral. The Sport test really needs to be cleaned up, made more relevant to light sport aircraft, and updated. Glad I made it in before the price increase.

  29. Jim McSherry Says:

    Here's another CFII who thinks that there should be some form of standardized, formal exam that encompasses at least the topics which are immediately appropriate to the rating being sought. I also see nothing terrible in two or three questions that are a bit "out there" (I recall a helicopter question on my Private ASEL test); just because you want to fly gliders now does not rule out the possibility of airships in the future.
    My beef, and it seems to be shared by others, is the quality of the questions themselves. I have taught physics, chemistry, and engineering classes at the college and high school level for over thirty years; some of the questions I have seen in the course of several FAA knowledge tests would never have made the grade in a rudimentary training course on test design. Student teachers do a better job. The interpolation questions are one case in point; another is the weather questions, which ask one to identify the outcome of a system rather than giving a description of conditions and requiring a diagnosis of the system or expected trend. That's what we have to do as pilots: assess the future probability, not "name the system."
    The hoops one must jump through in order to verify a new test item are daunting, and the changes take time. But in the time I have been flying, there has been little change in the tests that I can see, or in the ones my students encounter.

  30. Bill Walker Says:

    CATS and Lasergrade have arbitrarily increased the fees that THEY charge, but none of the increase is shared with the agencies that administer the tests. We're losing an excellent site in SE Alabama because the costs have increased but not the compensation.
