To chart your students' progress, use the Graduation Requirements Report. This report is your best view of how students are doing throughout the course.
To view the Graduation Requirements Report:
- Hover over the orange Reports tab, click New Reports.
- Find and click on the Graduation Requirements Report.
- Choose all the information you want to include in the report: goal set, student(s), and shift type.
- Click Go to display the report.
- If you like, click the PDF icon to get a printer-friendly report.
To customize your own goal sets:
- Click the +Goal Set button on the Graduation Requirements Report page.
- From the Custom Goals page, click the Add new goal set button.
- Fill out the form and be sure to click Save at the bottom of the page.
You can create goals that are specific to your different EMT, AEMT, and Paramedic class requirements. The default goal set (National) is based on the USDoT’s 1998 National Standard Curriculum. The Virginia goal set reflects the state’s 2013 Clinical Hour and Competency List. Feel free to use either goal set template and set goal numbers that reflect your program’s standards. Read more below about why customized graduation requirements are so important for accreditation and for demonstrating established program requirements in Fisdap.
To figure out why students may be getting credit for observing Team Leads rather than performing them, use the Observed Team Lead Report.
- Hover over the orange Reports tab, click New Reports.
- Find and click on Observed Team Lead Report.
- Choose the date range, desired format, and student name. Then click Go.
The Observed Team Lead Report can also be a great resource when students are concerned about why they aren't getting credit for performing the team lead.
Many programs, especially if they are using the National Standard Curriculum goals as a guide, require students to perform the patient exam and the patient interview (collectively known as the comprehensive assessment) to get credit for performing the Team Lead. This report identifies shifts where students did not perform the Team Lead, patient exam, and/or patient interview. You can also click a link and go directly to the shift in question from the Observed Team Lead report output.
If students are not getting credit for performing the team lead, it may be because they indicated they observed the patient exam and/or interview, or because they did not complete that section of data entry.
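The credit rule described above can be sketched as a simple filter. This is an illustrative sketch only; the field names (`team_lead`, `exam`, `interview`) are assumptions, not Fisdap's actual data model.

```python
def flag_shifts_without_full_credit(shifts):
    """Return shifts where the student did not perform the Team Lead,
    patient exam, and/or patient interview -- the shifts the Observed
    Team Lead Report identifies. Each shift is a dict of booleans
    (True = performed, False = observed or not entered)."""
    return [
        s for s in shifts
        if not (s["team_lead"] and s["exam"] and s["interview"])
    ]

shifts = [
    {"id": 1, "team_lead": True, "exam": True, "interview": True},
    {"id": 2, "team_lead": True, "exam": False, "interview": True},  # observed exam
    {"id": 3, "team_lead": False, "exam": True, "interview": True},  # observed lead
]
flagged = flag_shifts_without_full_credit(shifts)
# shifts 2 and 3 are flagged; shift 1 earns full performing credit
```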
If students wish to correct their data entry because something was recorded improperly, you will need to open the shift if it has been locked.
The concept underlying the Eureka Graph originated with Dr. M.E. Wilson in England in 1991. Wilson studied 12 ambulance staff members, 14 Royal Navy medical assistants, and eight medical students during a two- to three-week training attachment to the department of anesthesia at the Royal United Hospital.
Wilson used a simple graph to chart each participant’s intravenous cannulation (IV) and endotracheal intubation (ET) attempts. For each successful attempt, the student’s graph climbed one notch and for each failure the graph descended one notch. Participants and staff reported that they found this method of charting their progress enjoyable.
Wilson defined competency as the point at which the slope of the curve on the participant’s graph reached a consistent 39-degree elevation (corresponding to an 80% success rate). To claim evidence of mastery, Wilson suggested that the student’s performance curve needed to stay above the 80% mark for more than 20 trials or attempts.
This concept of competency was first introduced to EMS education in the United States by Kim Grubbs from Johnson County Community College (Overland Park, Kansas) at the 1995 EMS Today Conference. We call the point at which a student's performance dramatically improves the Eureka Point.
Here's an explanation of how the Eureka Graph works for field shifts:
- For each successful skill attempt, the student's graph climbs one point; and for each unsuccessful attempt, the graph drops one point.
- When the student reaches an 80% success rate over 20 attempts, the graph shows the moment of skill competence by drawing a dashed line (representing continued 80% success rate) as a continued reference point.
- The colors on the graph help identify a trend in performance. Red indicates that there are fewer than 10 attempts and/or that the success rate is less than 60%. Yellow indicates a success rate between 60% and 79% over the last 10 attempts. Green indicates a success rate of 80% or higher over the last 10 attempts.
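The scoring, color, and competency rules above can be sketched in a few short functions. This is a hypothetical illustration of the logic as described, not Fisdap's actual implementation; all function names are invented.

```python
def eureka_series(attempts):
    """attempts: list of booleans (True = success).
    Each success climbs the graph one point; each failure drops it one."""
    points, total = [], 0
    for ok in attempts:
        total += 1 if ok else -1
        points.append(total)
    return points

def trend_color(attempts):
    """Color-code the trend using the last 10 attempts, per the rules above."""
    if len(attempts) < 10:
        return "red"
    rate = sum(attempts[-10:]) / 10
    if rate < 0.60:
        return "red"
    if rate < 0.80:
        return "yellow"
    return "green"

def competency_point(attempts, window=20, rate=0.80):
    """Return the attempt number at which the student first sustains an
    80% success rate over 20 consecutive attempts, or None if not yet."""
    for i in range(window, len(attempts) + 1):
        if sum(attempts[i - window:i]) / window >= rate:
            return i
    return None
```

For example, a student who misses five early attempts and then succeeds consistently will see the line hug the baseline before climbing, with `competency_point` marking where the dashed 80% reference line begins.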
As you can see from the legend at the top of the graph, this report was generated for Student 220. The report was requested for all IV attempts between August 1, 1998, and October 31, 1998. The student performed 28 attempts in that period and has not yet reached the point of competency.
Student 66 achieved competency in 29 IV attempts. This graph is typical of most Eureka Graphs: the line hugs the baseline at the beginning, then begins a steady climb.
The student remains close to the 80% (dashed) reference line after reaching the competency point.
This student also reached a competency point, but did so much later in the program.
Here's how the Eureka Graph works for lab shifts:
While the Eureka Graph works the same for lab shifts, you have greater flexibility in establishing goals for when students hit the Eureka Point.
You can customize the Eureka Graph by setting the numbers for successes over attempts. You can use any numbers you think are appropriate, and the student will hit the Eureka Point when they achieve X successes over Y attempts.
As an example, you could type in 4/5, 16/20, or 80/100, depending on the requirements of your program.
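The custom X-over-Y setting can be sketched as a windowed check. Again, this is an assumed illustration of the rule as described, not Fisdap's actual calculation.

```python
def hit_eureka_point(attempts, successes_needed, window):
    """attempts: list of booleans (True = success).
    Return True once the student records at least `successes_needed`
    successes within any `window` consecutive attempts (e.g. 4/5, 16/20)."""
    for i in range(window, len(attempts) + 1):
        if sum(attempts[i - window:i]) >= successes_needed:
            return True
    return False

# A 4/5 requirement: four successes in the most recent five attempts.
lab_attempts = [True, False, True, True, True, True]
hit_eureka_point(lab_attempts, 4, 5)
```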
For more information about the origin of the Eureka Graph, you can read the following journal articles:
Fisdap renamed the Goals Report to the Graduation Requirements/Goals Report to help make the distinction between "goals" and "requirements."
We have used the term "goals" to refer to the number of skills or patient contacts students should obtain during their internship. However, the word "goal" does not mean the same thing as "requirement," so we changed the language to reflect the standard that student performance should be tracked against a program's requirements for successful completion of the program, i.e., graduation.
Since the Goals Report continues to function the same way, we have kept the word "goals" in the name of the report so that people who are accustomed to finding the Goals Report will not be confused. Also, programs can set benchmark goals to measure student progress throughout different phases of the internship, in addition to, and separately from, the program's graduation requirements.
We have also added a link to the Graduation Requirements Report from the Accreditation Reports tab so that it's easy to find in the context of accreditation or during a site visit. The Graduation Requirements Report may also be referred to as a "summary tracking document."
On behalf of the CoAEMSP, Pat Tritt wrote the following article on the importance of setting graduation requirements as part of tracking students' patient encounters. She has given us permission to share it with you, and we think it's a valuable reference on this topic.
Tracking Patient Encounters
Patricia L. Tritt, RN, MA
One of the challenges for Paramedic programs, and students, continues to be the tracking/documentation of patient encounters and skill events. The CAAHEP Standards and Guidelines for the Accreditation of Educational Programs in the Emergency Medical Services Professions requires that “The program must track the number of times each student successfully performs each of the competencies required for the appropriate exit point according to patient age, pathologies, complaint, gender, and interventions” (Standard III.C.2.). Unfortunately, adequate tracking, documentation, and the ability to produce summary reports are a common citation at the time of the site visit.
In the days of the DOT Curriculum, suggested numbers were provided. With the implementation of the National Education Standards, recommendations for numbers are no longer identified. CoAEMSP allows programs to establish their own minimum requirements; however, they must be rationally determined. Achieving competency should always be the goal. Some caveats:
- The minimum requirement must be determined by the communities of interest based on the outcome, i.e., all graduates are competent entry-level Paramedics in all domains.
- The minimum requirement is not determined by what is available: just because you don’t have a good pediatric rotation does not mean it is OK to set low requirements.
- And in that vein, one is never enough of anything.
Another area of confusion is the terminology of goal versus requirement. Some tracking programs may use the word goal and programs interpret this to mean that the number is just a target and not all students have to reach it to successfully complete the program. CoAEMSP requires that every graduating student has achieved the minimum number set in every category.
Once all the data is gathered, how is it organized and presented? Program faculty must monitor student progress towards meeting requirements on a regular basis and may need to adjust clinical or field internship sites and shifts to assist the student in obtaining the necessary experiences. There should be an ongoing dialog between the clinical coordinator and student regarding progress.
Documentation should clearly show the minimums and requirements for each student. A summary report is also required that lists each student in the cohort, each required ‘event’, the minimums in each category, and the total number achieved by each student in each category. This is required for the site visitors to be able to easily determine that all graduates met the requirement.
Remember to remove students from the summary report who dropped or did not complete the program for any reason; otherwise, it will appear that students who did not meet program requirements graduated.
Note that some programs also track when a student ‘observes’ an assessment or skill, but this cannot count toward the required minimums.