We did the test. Now what? Part 3: Differentiating Your Focus

Over the last two blog posts, I have looked at student scores by grade over the years of 2019-2022 and then compared the scores of my students to the school’s average scores. Today, I want to dig into differentiating your focus.

Read –> We did the test. Now what? Part 1: Looking at yearly performance over time.

Read –> We did the test. Now what? Part 2: Year-over-year comparisons and data analysis.

Looking at Specific Students Over Time

Just as you can compare your classes’ results year over year, you can do the same for individual students. I use these data trends to honor my “best students” at the end of the year. Besides “Most Proficient,” I have a data-based “Most Improved” winner. Here is an example of how to work with longitudinal data for a single student. I’m using actual student data but have changed student names/initials to protect privacy. This specific student, whom we’ll call CW, was the second-most improved student in my senior-level College Credit Plus course this year.

AAPPL dashboard
Figure 1.

To start, simply use the AAPPL Longitudinal Report and export it to Excel. If this isn’t enabled in your LTI Client Site account, contact your LTI representative and ask them to enable AAPPL Student Longitudinal Reports.

To run your AAPPL Student Longitudinal Report, simply select the timeframe and language. You will get an output like the one below, which will include a list of multiple students and their scores across years of testing.

longitudinal reports
Figure 2. Here you can see CW grew from I2 to I5 in PW and IR skills; in IL they also exhibited improved performance. They went from I2 to I1 in ILS. Their performance in three skill areas jumped up significantly on the ACTFL Performance Scale over the course of a single year!

While this example shows longitudinal data for all of my students in the year range and language I specified, I can also search for a specific student. Let’s take another student whom I’ll call JH, for example. In the general output, I can see they showed signs of improvement (see screenshot below).

longitudinal report for one student
Figure 3. I see JH’s scores in the broad data output from my longitudinal report and decide to run a longitudinal report just on them.

Perhaps I want to isolate their scores and drill down a bit, or maybe even sit down with them and show them their scores on my screen to discuss together. To look at a specific student, I enter the student’s name (in this case, I entered JH’s name) and click Search; this generates an output like the example below.

longitudinal report for one student
Figure 4. Here I’ve isolated JH’s information in the longitudinal report. I can quickly and easily see their ILS and IL skills improved from 11th to 12th grade, and I can see their PW scores remained the same both years, even after taking a retest on PW (in 2022). Using this view, I can show them their results on the screen without exposing other students’ data.

Looking at Broader Trends

As you look at longitudinal data, you can see trends across years. It can be interesting to look at the changes and patterns from elementary to middle school or middle to high school, as you see in the example below. Your ability to view data across schools or grade levels will depend on the viewing access set up for your username in the Client Site. Longitudinal reports can empower you as a teacher to become familiar with students’ AAPPL data before they arrive at your grade level or your school. They can also empower teachers and administrators to see patterns in programs across a district or multiple schools.

longitudinal report
Figure 5. This image blurs out student information, but you can see the selected student data shows results for testing each year. This student earned I4 in ILS in 5th grade and then A1 in 7th grade. If I then had this student in 9th grade, I could already see some trends and have an idea of the student’s progress, regardless of who their teacher was or which middle school program they were in.

Playing with Data

When you look at the longitudinal reports, you also have the option to export the data to Excel. Simply enter your year range and language (Bubble 1), hit Search, and then select Export to Excel (Bubble 2). This will allow you to conduct further analysis of the data if you’re comfortable with Excel and its functions. You can look at the data within a year, year over year, by years of instruction, and more. It might seem incredibly time-consuming to evaluate data like I’ve shown in this series of blog posts, but doing so has helped me inform my instructional strategies, recognize student development, and drill into specific points that might be important.

AAPPL Client site
Figure 6. Playing with data.
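If you’re comfortable scripting, the exported data can also be analyzed outside Excel. Here is a minimal sketch in Python with pandas; the column names and the ordinal mapping of AAPPL ratings are illustrative assumptions about the export layout, not an official AAPPL scale:

```python
import pandas as pd

# ASSUMPTION: an ordinal mapping for AAPPL ratings (N1-N4, I1-I5, A),
# used only for illustration so that growth can be computed numerically.
SCALE = {f"N{i}": i for i in range(1, 5)}
SCALE.update({f"I{i}": 4 + i for i in range(1, 6)})
SCALE["A"] = 10

# Hypothetical export layout: one row per student, skill, and test year.
scores = pd.DataFrame({
    "student": ["CW", "CW", "CW", "CW"],
    "skill":   ["PW", "PW", "IR", "IR"],
    "year":    [2021, 2022, 2021, 2022],
    "rating":  ["I2", "I5", "I2", "I5"],
})

scores["level"] = scores["rating"].map(SCALE)

# Growth per student and skill: last year's level minus first year's level.
growth = (scores.sort_values("year")
                .groupby(["student", "skill"])["level"]
                .agg(lambda s: s.iloc[-1] - s.iloc[0]))
print(growth)
```

With real exported data you would replace the hard-coded DataFrame with `pd.read_excel(...)` on your exported file; sorting the whole growth column then surfaces your “Most Improved” candidates automatically.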

You can use AAPPL data to benefit your own instruction or even go beyond that to working with your school, program, or district. The first step is to take a look together with your colleagues and re-center the discussion around proficiency. Then it’s easier to all pull in the same direction! Our department has done more and more of this since our adoption of the AAPPL. Not only am I improving in my instructional strategies, but as a department we are all paying attention and working to aim our trajectory higher. As you dig into longitudinal reporting, you might be surprised what trends you find. The data can provide powerful information that can significantly impact the work you and your language departments do!

Validating Proficiency Benchmarks at a US Military Academy

By Pete Swanson, PhD, and Jean W. LeLoup, PhD, USAFA

Introduction

Some would say Communicative Language Teaching approaches have helped shift the paradigm of world language teaching and learning. Proficiency testing is now prioritized, with program coordinators and others setting proficiency benchmarks for language learners to achieve. Unfortunately, proficiency testing can be costly, and many programs lack funds, which can inhibit such assessment. Nevertheless, several large universities in the United States (US) received federal funding under the Language Flagship Program to assess learners’ proficiency in a number of languages (Winke & Gass, 2019). Established at the turn of the 21st century, the Flagships called for institutions of higher education to create a “viable process to assess proficiency learning in high quality, well-established academic language programs” (Swanson et al., 2022, p. 2).

Heeding this call, researchers at the US Air Force Academy (USAFA) applied for funding to examine the oral proficiency of cadets studying Spanish. The purpose of the funding was to validate the proficiency benchmarks set forth by faculty members. These benchmarks are codified in the Spanish Language Roadmap, which specifies proficiency goals for each of the four years of language study at USAFA.

Methods and Findings

Following IRB approval, the researchers randomly selected cadets in second- (N = 48), third- (N = 53), and fourth-year (N = 28) Spanish language courses to participate in the study. Funding limited the total number of participants to 27 cadets, who took the Oral Proficiency Interview (OPI; Language Testing International, 2022) in the USAFA language lab in April 2022. Data were entered into SPSS version 28 for analysis.

With respect to the OPI ratings for those who had studied for 240 classroom hours (i.e. four semesters) at USAFA, results shown in Table 1 indicate that 94% of the participants attained or surpassed the benchmark (Intermediate-Mid) for this level.

Table 1
OPI Results for participants (N=16) who studied Spanish for 240 classroom hours (i.e. four semesters) at USAFA.
Proficiency Rating Number of Participants
Intermediate Low 1
Intermediate Mid 11
Intermediate High 4

Turning to the OPI ratings for those who had studied Spanish at USAFA for 320 classroom hours (i.e., six semesters), findings showed that 80% of the participants reached the benchmark (Intermediate High), while one reached the Intermediate Mid rating for this level.

Table 2
OPI Results for participants (N=5) who studied Spanish for 320 classroom hours (i.e. six semesters) at USAFA.
Proficiency Rating Number of Participants
Intermediate Mid 1
Intermediate High 4
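The attainment percentages reported for the 240-hour and 320-hour groups follow directly from the counts in Tables 1 and 2; a quick sanity check in plain Python:

```python
# Table 1 counts: participants (N=16) at 240 classroom hours.
# Benchmark for this level is Intermediate Mid, so attainment means
# Intermediate Mid or higher.
table1 = {"Intermediate Low": 1, "Intermediate Mid": 11, "Intermediate High": 4}
at_or_above = table1["Intermediate Mid"] + table1["Intermediate High"]
pct_240 = round(100 * at_or_above / sum(table1.values()))  # 15 of 16

# Table 2 counts: participants (N=5) at 320 classroom hours.
# Benchmark for this level is Intermediate High.
table2 = {"Intermediate Mid": 1, "Intermediate High": 4}
pct_320 = round(100 * table2["Intermediate High"] / sum(table2.values()))  # 4 of 5

print(pct_240, pct_320)  # 94 80
```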

Finally, there were six participants who studied abroad for a semester at a foreign military academy in a Spanish-speaking country (Chile or Spain). All of these individuals took the OPI and received a rating in the Advanced range; no proficiency benchmark had been set for this particular group. Interestingly, none of these participants was a heritage Spanish speaker, and 66% of the participants in this group were STEM (Science, Technology, Engineering, and Math) majors.


Table 3
OPI Results for participants (N=6) who studied abroad for one semester at a foreign military academy in a Spanish-speaking country.
Proficiency Rating Number of Participants
Advanced Low 3
Advanced Mid 3

Summary

Preliminary results from the OPI testing are encouraging vis-à-vis the attainment of the benchmark targets. OPI results from the present study will be used to inform strategies for setting different proficiency goals for those who study abroad for a semester. Nevertheless, given the small number of participants in the study and the limited financial resources to conduct the research, the researchers call for more funding and investigation to corroborate and build on the present findings.


DISTRIBUTION STATEMENT A: Approved for public release: distribution unlimited. DISCLAIMER: The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the United States Air Force Academy, the Air Force, the Department of Defense, or the U.S. Government. PA#: USAFA-DF-2022-558

We did the test. Now what? Part 2: Year Over Year Comparisons and Data Analysis.

In my previous article, I took a high-level look at my 11th and 12th graders over three years (2019-2022). In this post, we’ll dig into the data analysis a bit, or what I did as I looked at the data from the AAPPL reporting tool. I started by looking at the top performance by grade for each year, creating my own Excel spreadsheet to dig into the numbers. I exported the data from the AAPPL report and laid it out as shown below.

Year over Year Comparison (Top Performance by Grade Highlighted) 

test analysis
Figure 1

Figure 1 captures my analysis in Excel of which years my students had the highest performance for each skill. The highest speaking performance for 11th graders happened in 2022, while the highest performance for 12th graders was in 2020. In writing, the highest performance for 11th graders happened in 2019, and in 2020 for the 12th graders. In listening, 11th graders peaked in 2020 and 12th graders in 2019. Reading scores were highest for 11th graders in 2022 and for 12th graders in 2021. Clearly, if I were to plot the highest scores on a line graph, it would not show linear growth from year to year.

test analysis
Figure 2

Figure 2 shows the aggregate by grade and skill over three years. Looking at 2019-2022 average scores, 11th graders showed the lowest average scores in listening and speaking and the highest in writing. For seniors, the lowest average scores were in reading and interpersonal listening and speaking; the highest were in interpretive listening.
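The two views in Figures 1 and 2 (peak year per grade and skill, and the multi-year aggregate) can also be computed programmatically. A minimal sketch in Python with pandas, using placeholder scores for illustration only, not my students’ actual results:

```python
import pandas as pd

# Placeholder data for illustration -- substitute your exported AAPPL scores.
df = pd.DataFrame({
    "year":  [2019, 2020, 2022, 2019, 2020, 2022],
    "grade": [11, 11, 11, 12, 12, 12],
    "skill": ["speaking"] * 6,
    "score": [5.1, 5.4, 5.8, 6.0, 6.3, 5.9],
})

# Figure 2-style aggregate: average score per grade and skill across all years.
avg = df.groupby(["grade", "skill"])["score"].mean()

# Figure 1-style highlight: the year of peak performance per grade and skill.
best_year = df.loc[df.groupby(["grade", "skill"])["score"].idxmax(),
                   ["grade", "skill", "year"]]

print(avg)
print(best_year)
```

With the full export (all four skills and both grades), the same two `groupby` calls reproduce the highlighted cells in the spreadsheet without any manual highlighting.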

Now my deeper analysis begins. My next step is to compare my students to the averages within my school (see image below). I can see that in half of the modes and grades, my students have historically performed above our school’s averages. There are some specific factors at play in different years; for example, I tend to test students who are in our most advanced classes (College Credit Plus), so the expectation should generally be that they outperform our school’s averages. More importantly, I can compare my students’ performance to that of my prior students, which tells me that while this year’s students matched or outperformed prior years in getting close to or beyond I-5 (our criterion for the Seal of Biliteracy), there has been a downward trend in speaking scores over the last three years.

Below you’ll see that I plugged my data into an Excel spreadsheet. I know not everyone is comfortable using Excel, but I didn’t use any advanced functions. I simply copied the data from the AAPPL Report and pasted it into Excel.

test analysis
Figure 3

Figure 3 shows my analysis comparing my students to the school averages. Again, I simply pulled the numbers from my AAPPL reporting and plugged them into Excel.
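This comparison against school averages needs no advanced Excel functions, and it is equally simple in code. A minimal sketch in Python with pandas; the numbers are placeholders, not our actual averages:

```python
import pandas as pd

# Placeholder averages for illustration -- substitute your exported AAPPL numbers.
my_students = pd.Series({"speaking": 5.6, "writing": 6.1,
                         "listening": 5.2, "reading": 5.9})
school_avg  = pd.Series({"speaking": 5.2, "writing": 6.3,
                         "listening": 5.0, "reading": 5.9})

# Per-skill gap between my students and the school average.
diff = my_students - school_avg

# Skills where my students outperform the school average.
above = diff[diff > 0].index.tolist()

print(diff)
print("Above school average in:", above)
```

Running this per year (one pair of Series per test year) makes multi-year patterns, like the downward trend in speaking, stand out as a column of shrinking gaps.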

This analysis drives me to refine my conversational activities moving forward, increasing the number of scaffolded conversations with specific examples of how to narrate, rather than leaving most of my conversation activities open-ended. I suspect that part of this downturn is my overconfidence in my students based on classroom activities where they shine on a particular topic, as well as my assumption that simply participating in Spanish-language conversations with native speaker partners on a regular basis leads to increased proficiency. It is clear to me now that even in those settings, my students might benefit from clearer instruction on how to participate more actively and completely at an Intermediate High/Advanced Low level. Stay tuned for my next post on differentiating your focus.

Be More Than an Insurance Agent—Be an Agent of Change!

insurance agent talking to a young couple

Handling insurance details, claims, and sensitive and delicate information where emotions are involved can be a difficult task even when the policyholder and the insurance agent speak the same language. Can you imagine how challenging it can be when we add a language barrier into the mix? The insurance industry is aware of the need to recruit employees with adequate language proficiency levels to meet the diverse needs of customers today. If you are a bilingual or multilingual professional and your intention is to pursue a career in the growing and constantly evolving insurance industry, you should know that both employers and customers need you.

According to a survey conducted by Ipsos Public Affairs for ACTFL, most employers report that the demand for languages other than English has grown over the past five years. Additionally, the ability to speak more than one language has become one of the top skills required in the delivery of different services, including insurance offerings. Establishing a connection between language skills and economic competitiveness is just one of the main results of this survey, which, as the report specifies, strives to raise awareness of and action around language competency in the United States.

According to the Insurance Information Institute, the insurance industry is currently transforming to encourage and boost diversity in its organizational culture. A multilingual or bilingual insurance agent is a valuable resource who can make recommendations based on the specific needs of the community they work for or represent. Similarly, their commitment and meaningful participation in the community they serve is a vital piece of creating powerful relationships and gaining the trust of prospective clients. An article featured on AgentPipeline.com highlights that, “A more significant piece of what insurance agents do is educate our customers on how to ensure they are covered and help protect their financial security and teach them how to take advantage of benefits that are included in their insurance policies.” To become active in your community, they recommend volunteering at community events, supporting local businesses, and developing professional connections online with digital tools, among other activities.

Watch –> Leading with Understanding: Bilingual Insurance Agents Build Strong Working Relationships

One of the main goals of the insurance industry is to provide quality services. The insurance world recognizes that hiring professionals who can speak more than one language is an advantage that helps impart exceptional customer service that focuses on individuals’ preferred language. Communicating in an accurate way and overcoming language and cultural barriers can help build relationships, not just locally but internationally.

Helping customers understand what an insurance policy has to offer, especially those whose first language isn’t English, does create more business opportunities; however, and more importantly, it allows agents to make an impact in the lives of those they serve.

Contact Language Testing International (LTI) if you need to get certified in another language and provide your current or future employer and your customers with reliable and legally defensible language proficiency results. The validity of each assessment is supported by three decades of research. LTI makes the process simple with remotely proctored assessments in over 120 languages that you can take anytime, anywhere.