Internships & Fellowships



The following information is maintained by the Graduate Student Issues Committee (GSIC). If you would like to have your internship or fellowship listed here, please contact our GSIC co-chair, Delwin Carter (delwincarter@ucsb.edu). This list will be compiled as internships become available for 2020!


Graduate Student Internships

Internships are a valuable way to link your academic experience with the professional arena. Below is a list of internships that will allow students to go beyond the classroom and conduct practical research with a mentor from a testing company or research agency.


Graduate Student Fellowships

Fellowships provide structured work experience and professional development, including intensive training and experiential learning. Below is a list of fellowships that support fellows' growth and offer opportunities to explore a particular field of measurement.

Cognia

Psychometric Externship
Cognia is offering a unique externship program for students interested in psychometrics or educational measurement. Cognia will grant up to two (2) externships to students currently working toward a Ph.D. in psychometrics. The program is designed to help students gain an in-depth understanding of how the theory and practice of psychometrics interact in the work of Cognia psychometricians, without requiring relocation.

During the 30-week program, the primary focus will be on completing a research project culminating in an NCME or AERA proposal. The research area can be almost any topic in educational measurement. Past research areas have included equating, parameter drift, dimensionality estimation, skills diagnosis, standard setting, response time modeling, vertical scaling, and multistage testing.

In general, externs will work closely with our team of psychometricians to further their understanding of how the discipline of psychometrics is applied in a customized educational assessment environment. The program is scheduled to run from January 6, 2020, to July 31, 2020. Externs will work on projects 8-10 hours per week (some weeks more, some less). The program covers three trips to Dover, NH (at the program start on January 6, over spring break, and for the final presentation in the last week of July). All transportation, lodging, and meal costs will be covered during the trips. Between trips, externs will work with a mentor via Skype. Each extern will be loaned a Cognia laptop to facilitate access to Cognia psychometric resources.

Deadline for application: December 11, 2019. Please submit an application with two professional or educational references. Cognia will pay a stipend of $8,000 over the 30-week externship.

Eligibility
Eligible candidates will be enrolled in a doctoral program (e.g., psychometrics, educational measurement, or another appropriate discipline) at a fully accredited university or college, with at least two (2) years of full-time postgraduate study leading toward a Ph.D. For more information and to apply, visit https://tinyurl.com/y5g6aw8c

 

National Board of Medical Examiners 

National Board of Medical Examiners Internship

Summer 2020 Internships in Assessment Science and Psychometrics

June 1 - July 24, 2020   Philadelphia, PA

Overview

The National Board of Medical Examiners® (NBME®) is a mission-driven, not-for-profit organization that serves the public by developing, administering, and conducting research on high-quality assessments for healthcare professionals.

NBME programs include the United States Medical Licensing Examination®; an extensive offering of achievement tests for courses offered by medical schools; and numerous client examinations in medicine and other health professions. The variety of assessment programs creates numerous opportunities for applied and theoretical research that can impact practice.

The NBME employs approximately 30 doctoral-level psychometricians and assessment scientists, as well as several MDs specializing in medical education. The staff is internationally recognized for its expertise in statistical analysis, psychometrics, and test development.

Interns will interact with other graduate students and NBME staff, and will present completed projects or work-in-progress to NBME staff. Internships typically result in conference presentations (e.g., NCME) and sometimes lead to publication or dissertation topics.

Requirements

  • Active enrollment in a doctoral program in measurement, statistics, cognitive science, medical education, or a related field; completion of two or more years of graduate coursework.
  • Experience or coursework in one or more of the following: test development, IRT, CTT, statistics, research design, and cognitive science. Advanced knowledge of topics such as equating, generalizability theory, or Bayesian methodology is helpful. Skill in writing and presenting research. Working knowledge of statistical software (e.g., Winsteps, BILOG, SPSS, SAS, or R).
  • Interns will be assigned to one or more mentors, but must be able to work independently.
  • Must be authorized to work in the US for any employer. If selected, F-1 holders will need to apply for Curricular Practical Training authorization through their school’s international student office, and have a social security number for payroll purposes.

Compensation

Total compensation for the two months is approximately $9,800 and is intended to cover all major expenses (food, housing, travel).

Research Projects

Interns will help define a research problem; review related studies; conduct data analyses (real and/or simulated data); and write a summary report suitable for presentation. Projects are summarized below. Applicants should identify, by number, the two projects they would prefer to work on.

  1. Application of Natural Language Processing (NLP) in the field of assessment: The application of NLP in assessment has led to innovations and changes in how testing organizations design and score tests. Possible projects will investigate novel NLP applications, using real or simulated data, for various processes relevant to an operational testing program (e.g., test construction, key validation, standard setting). Results would inform possible improvements to current best practices.
  2. Modeling answer-change strategy in a high-stakes MCQ examination: In this project, we explore the use of the Rasch Poisson Count model (Rasch, 1960/1980) to extend the hierarchical speed-accuracy model (van der Linden, 2007) to item revisits and answer-change behavior in a high-stakes examination, using data collected in an experimental setting. We propose to connect the elements of process data available from a computer-based test (correctness, response time, number of revisits to an item, the outcome of each revisit, the examinee's IRT ability, and IRT item characteristics) in a hierarchical latent trait model that explains an examinee's decision to change an initial response. The relationship between working speed, ability, the number of visits, and the number of answer changes can be modeled with a multidimensional model that conceptualizes them as latent variables (a schematic sketch appears after this list). The model should help us better understand the answer-change and cognitive behavior of examinees in a timed high-stakes examination.
  3. Performance Assessments: The intern will pursue research related to improving the precision and accuracy of a performance test involving physician interactions with standardized patients. Possible projects include designing an enhanced process for flagging aberrant ratings by trained raters and supporting research on standardized patients in a high-stakes exam.
  4. Measurement Instrument Revision and Development: This project will involve revising a commonly used measurement instrument so that appropriate inferences can be made about medical students. Duties will include the following: working with subject-matter experts to revise the existing items; conducting think-alouds with medical students; developing a pilot measure of potential items; exploratory and confirmatory factor analysis of initial pilot results to gather structural validity evidence; developing a larger survey to gather concurrent and discriminant validity evidence with the revised measure; and administration and evaluation of the larger survey.
  5. Characterizing (and Visualizing) Item Pool Health: The health of an item pool can be defined in a number of ways. Our current test development practices utilize have/need reports broken down by content area, and many content outlines are hierarchical in nature, with several layers of content coding and metadata. The problem is that the have/need ratios are, for the most part, one-dimensional, but details within the "have" portion of these ratios represent multidimensional information that can be used to improve multiple aspects of test development, including form construction, test security, pool management/maintenance, and targeting of item-writing assignments. The aims of this project are two-fold: (1) develop helpful, easily interpretable metrics to assess item pool health; and (2) employ a sophisticated visualization method of item pool health (e.g., via R Shiny, D3.js, .NET languages/libraries, etc.) to assist in improving one or more aspects of test development.
  6. Item Tagging and Mapping with Natural Language Processing: Test content outlines and specifications often change rapidly within cutting-edge domains. In response to these changes, test development teams must "map" the pre-existing content onto the new content domains. Such a task is trivial when the new and old content outlines share equivalent domains. However, this direct mapping rarely occurs, leaving item mapping to be done manually, a time-intensive task that is prone to human error and to differences in subjective interpretation across reviewers. This project seeks to integrate natural language processing (NLP), machine learning (ML), and data visualization to (1) assist subject-matter experts with creating new content outlines; (2) help map items to new content domains; (3) review manual item mappings for accuracy as a quality control measure; and (4) visually represent the degree of content distribution within a group of items (e.g., test form, item bank, etc.); an illustrative mapping sketch appears after this list. A component of this project will be the use of sophisticated data visualization methods to allow subject-matter experts and test development staff to more easily examine items in multiple contexts. Strong candidates for this position will have knowledge of Python or a similar language and of common NLP libraries (e.g., Keras, TensorFlow, PyTorch, etc.).
  7. Computer-Assisted Scoring of Constructed Response Test Items: Recently, the NBME has developed a computer-assisted scoring program that utilizes natural language processing (NLP). The two main components of the program are (1) ensuring that the information in the constructed response is correctly identified and represented; and (2) building a scoring model based on these concept representations. Current areas of research surrounding this project include (but are not limited to): refining quality control steps to be taken before an item is used in computer-assisted scoring; linking and equating computer-assisted scores with human rater scores; evaluating a scoring method based on orthogonal arrays; and developing metrics that assess item quality and test reliability when computer-assisted scores and human scores are used to make classification decisions. The final project will be determined based on a combination of intern interest and project importance.
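
For context on project 2, the following is a rough, non-authoritative schematic of how the components described above might be combined; the revisit-count notation (R, \nu, \delta) is introduced here for illustration and is not NBME's. At the first level, the hierarchical speed-accuracy framework (van der Linden, 2007) models correctness and response time separately, and a Rasch Poisson Count component can be appended for revisit counts, with the person parameters tied together at the second level:

  U_{ij} \sim \mathrm{Bernoulli}\bigl(P_i(\theta_j)\bigr), \qquad
  \ln T_{ij} \sim \mathcal{N}\bigl(\beta_i - \tau_j,\ \alpha_i^{-2}\bigr),

  R_{ij} \sim \mathrm{Poisson}(\lambda_{ij}), \qquad
  \ln \lambda_{ij} = \nu_j - \delta_i,

  (\theta_j,\ \tau_j,\ \nu_j)^{\top} \sim \mathcal{N}(\boldsymbol{\mu},\ \boldsymbol{\Sigma}),

where U_{ij}, T_{ij}, and R_{ij} are examinee j's correctness, response time, and number of revisits on item i; \theta_j, \tau_j, and \nu_j are the examinee's ability, speed, and revisit propensity; and the covariance matrix \Sigma captures the relationships among the latent variables.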
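
Similarly, for project 6, the minimal sketch below illustrates one plausible mapping approach (not NBME's actual pipeline): embed item stems and new content-domain descriptions, then assign each item to its most similar domain by cosine similarity. The library, model name, similarity measure, and data are all illustrative assumptions.

  # Illustrative sketch only: map bank items to a new content outline by
  # embedding both and choosing the highest-cosine-similarity domain.
  from sentence_transformers import SentenceTransformer
  from sklearn.metrics.pairwise import cosine_similarity

  model = SentenceTransformer("all-MiniLM-L6-v2")  # arbitrary illustrative model

  items = ["Item stem text 1 ...", "Item stem text 2 ..."]            # existing bank items
  domains = ["Domain A description ...", "Domain B description ..."]  # new content outline

  item_vecs = model.encode(items)      # shape: (n_items, dim)
  domain_vecs = model.encode(domains)  # shape: (n_domains, dim)

  sims = cosine_similarity(item_vecs, domain_vecs)
  for item, row in zip(items, sims):
      best = row.argmax()
      # Low-similarity mappings would be routed to subject-matter experts.
      print(f"{item[:30]!r} -> {domains[best][:30]!r} (cos={row[best]:.2f})")

In practice, the similarity threshold separating automatic mappings from expert review would be tuned against a sample of manually mapped items.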

Application

Candidates may apply by going to https://nbme.applicantpro.com/jobs/. A cover letter outlining experience and listing project interests by number, along with a current resume, is required. The application deadline is February 3, 2020.

 

All applicants will be notified of selection decisions by February 21, 2020.

 

Educational Testing Service (ETS)

2020 Educational Testing Service Internship

Interns in this eight-week program participate in research under the guidance of an ETS mentor. Each intern is required to give a brief presentation about the project at the conclusion of the internship. The internship is carried out in the ETS offices in Princeton, N.J. This year, projects may be conducted in the following research areas:

Research Area 1: English Language Learning and Assessment

Research Area 2: Career and Technical Education

Research Area 3: Teacher Diversity and Quality

Research Area 4: Design and Validity for Digital Assessment

Research Area 5: Modeling and Analyzing Examinee Response Processes

Research Area 6: Statistical and Psychometric Foundations

Research Area 7: Group-Score Assessment

Research Area 8: Applied Psychometrics

Research Area 9: Human and Automated Scoring

  • The application deadline is February 1, 2020.
  • Applicants will be notified of selection decisions by March 31, 2020.
  • Eight weeks: June 1, 2020–July 24, 2020
  • $6,000 salary
  • Transportation allowance for relocating to and from the Princeton area
  • Housing will be provided for interns commuting more than 50 miles
  • Current full-time enrollment in a relevant doctoral program
  • Completion of at least two years of coursework toward the doctorate prior to the program start date

For more information please view the ETS Internship Announcement.

 

American College Testing (ACT) Psychometric Intern

ACT Psychometric Research Intern

Overview

ACT is a nonprofit organization helping people achieve educational and workplace success. Our programs are designed to boost lifelong learning in schools and workplaces around the world. Whether it's guiding students along their learning paths, enabling companies to develop their workforce, fostering parent, teacher, and counselor understanding of student progress, guiding job seekers toward career success, or informing policymakers about education and workforce issues, ACT is passionate about making a difference in all we do.

Responsibilities

Position Objective: Conduct a study investigating the use of item latency information to detect aberrant testing behaviors in computer-based tests.
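
By way of illustration only (the posting does not specify a method), one common family of latency-based flags compares each response time against an item-level norm and flags examinees with a high share of unusually fast responses, in the spirit of rapid-guessing indices. All thresholds and data below are hypothetical.

  # Illustrative sketch only: flag examinees with many unusually fast
  # responses relative to item-level median latencies.
  import pandas as pd

  # Long-format response log: one row per (examinee, item), latency in seconds.
  log = pd.DataFrame({
      "examinee": ["e1", "e1", "e2", "e2"],
      "item":     ["i1", "i2", "i1", "i2"],
      "latency":  [45.0, 60.0, 2.0, 3.0],
  })

  # A response is "rapid" if faster than 10% of its item's median latency
  # (the 10% multiplier is an arbitrary illustrative threshold).
  item_median = log.groupby("item")["latency"].transform("median")
  log["rapid"] = log["latency"] < 0.10 * item_median

  # Flag examinees whose proportion of rapid responses exceeds a cutoff.
  rapid_rate = log.groupby("examinee")["rapid"].mean()
  print(rapid_rate[rapid_rate > 0.25])  # cutoff is illustrative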

Typical job activities include:

  • designing a research study

  • writing programs and conducting research

  • summarizing the results

  • writing research reports

  • presenting the outcome of the research and submitting a proposal based on the study to NCME/AERA

Qualifications

Minimum Qualifications

Education: Currently enrolled in and pursuing a graduate degree in educational measurement, statistics, or a related field.

Knowledge, Skills and Abilities:

  • knowledge of, and research experience with, a range of statistical models

  • computer programming skills in SAS, R, and/or Python

  • good writing skills

For more information please see the job posting.

American College Testing (ACT) Internship: Survey, Validity & Efficacy Research

ACT Survey, Validity & Efficacy Research Intern

Overview

ACT is a nonprofit organization helping people achieve educational and workplace success. Our programs are designed to boost lifelong learning in schools and workplaces around the world. Whether it's guiding students along their learning paths, enabling companies to develop their workforce, fostering parent, teacher, and counselor understanding of student progress, guiding job seekers toward career success, or informing policymakers about education and workforce issues, ACT is passionate about making a difference in all we do.

Responsibilities

Position objective: The summer intern will help further ACT's understanding of the environmental factors that influence students' attitudes toward, decision-making about, and use of test preparation. This work will help us better articulate the key levers in test preparation usage through a literature review and the implementation of a research study. The results will serve as the foundation for investigating the impact of introducing modular testing and superscoring on test preparation. Co-authorship of publications and submissions to national conferences will also be considered.

Description of type of work/area of focus: The summer intern will draw on ecological perspectives (e.g., systems theory and systems thinking) in program evaluation to better understand test preparation use. Work will focus on applying these ecological perspectives to current research on test preparation, followed by the design and implementation of a pilot study. The study will help ACT understand the environmental factors that influence students' attitudes toward, decision-making about, and use of test preparation. Work will include: a) documentation of methodological approaches, b) a framework that applies these approaches to test preparation, c) implementation of a pilot study that applies this framework, and d) a write-up of findings from the pilot study. A paper synthesizing this work will be written. Submission to national conferences will also be considered.

Typical work-related activities will include:

  • Conducting a review of the literature on topics of focus

  • Supporting the design and implementation of a study (e.g., development of interview protocols or surveys)

  • Analyzing pilot study data

  • Participating in regular meetings with supervisors

  • Presenting findings to technical and non-technical audiences

  • Coauthoring papers

Qualifications

Minimum Qualifications: Candidates should be currently enrolled in a doctoral program in program evaluation or in a social science (e.g., education, psychology, sociology) research program. Completion of two years of graduate school at the time of the internship is required.

Experience Requirements/Preferences: Currently enrolled in a relevant doctoral program with program evaluation and research design training. Prior experience conducting evaluation/research preferred.

Knowledge, Skills and Abilities: Knowledge of and experience in program evaluation and/or systems theory and research methods (mixed methods preferred), as well as good professional writing and presentation skills, are required.

For more information please see the job posting.

American College Testing (ACT) Assessment Transformation Internship

ACT Assessment Transformation Internship

Overview

ACT is a nonprofit organization helping people achieve educational and workplace success. Our programs are designed to boost lifelong learning in schools and workplaces around the world. Whether it's guiding students along their learning paths, enabling companies to develop their workforce, fostering parent, teacher, and counselor understanding of student progress, guiding job seekers toward career success, or informing policymakers about education and workforce issues, ACT is passionate about making a difference in all we do.

Responsibilities

Position Objective: A successful intern will learn about methods for studying the comparability of paper and online assessments and develop statistical analysis programs to examine mode comparability. Through this work, the intern will gain experience with common psychometric analyses supporting large-scale assessment programs. The intern will work with Assessment Transformation staff to write an AERA or NCME proposal describing recent mode comparability studies for the ACT.
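
As a minimal, illustrative first pass at a mode comparability analysis (not ACT's actual procedure), one might compare paper and online score distributions using a standardized mean difference and a distributional test; the data below are simulated.

  # Illustrative sketch only: first-pass comparison of paper vs. online scores.
  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(0)
  paper = rng.normal(loc=20.0, scale=5.0, size=1000)   # simulated paper-mode scores
  online = rng.normal(loc=19.6, scale=5.1, size=1000)  # simulated online-mode scores

  # Standardized mean difference (Cohen's d with pooled SD).
  pooled_sd = np.sqrt((paper.var(ddof=1) + online.var(ddof=1)) / 2)
  d = (paper.mean() - online.mean()) / pooled_sd

  # Two-sample Kolmogorov-Smirnov test of distributional equivalence.
  ks = stats.ks_2samp(paper, online)
  print(f"Cohen's d = {d:.3f}, KS p-value = {ks.pvalue:.3f}")

An operational comparability study would of course go further (e.g., matched samples or randomized mode assignment, item-level DIF analyses, and equating checks), but summaries like these are a typical starting point.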

Qualifications

Minimum Qualifications: Currently pursuing a graduate degree in educational measurement, psychometrics, educational statistics, educational psychology, or a related field.

Experience Requirements/Preferences: Two years of doctoral studies in a relevant field

Knowledge, Skills and Abilities:

  • classical test theory

  • item response theory

  • equating

  • statistical analysis and programming

  • data visualization

  • technical writing

  • teamwork

For more information please see the job posting.

ETS Post Doctoral Fellowship 

ETS Post Doctoral Fellowship

Description

Individuals who have earned their doctoral degree within the last three years are invited to apply for a rewarding fellowship experience that combines working on cutting-edge ETS research projects with conducting independent research relevant to ETS's goals. The fellowship is carried out in the ETS offices in Princeton, N.J. This year we are seeking applicants with experience in the following areas:

  • Applied Psychometrics
  • Artificial Intelligence Based Automated Scoring
  • Modeling and Scoring Item Responses from Interactive and Simulation-Based Assessments
  • Modeling of Response Processes and Response Times
  • Psychometric Issues in Adaptive Testing Designs
  • Statistical and Psychometric Foundations
  • Statistical and Psychometric Issues in Group-Scored Assessments

Program Goals
  • Provide research opportunities to individuals who hold a doctorate in the fields indicated above
  • Enhance diversity and inclusion among underrepresented groups conducting research in educational assessment and related fields

Important Dates
  • March 1, 2020 — deadline for preliminary application
  • April 15, 2020 — deadline for final application materials

Duration of Program

The fellowship is for a period of up to two years, renewable after the first year by mutual agreement.

Compensation

  • Competitive salary
  • $5,000 one-time relocation incentive for round-trip relocation expenses
  • Employee benefits, vacation, holidays and other paid leave in accordance with ETS policies

Eligibility
  • Doctorate in a relevant discipline within the past three years
  • Evidence of prior independent research

For more information please visit the ETS Post Doctoral Fellowship announcement.

ETS Harold Gulliksen Psychometric Research Fellowship 

ETS Harold Gulliksen Psychometric Research Fellowship

Description

During the summer, selected fellows are required to participate in the Summer Internship Program in Research for Graduate Students, working under the guidance of an ETS mentor. During the subsequent academic year, fellows study at their universities and carry out research under the supervision of an academic mentor and in consultation with their ETS mentor.

Program Goals

The goal of this program is to increase the number of well-trained scientists in educational assessment, psychometrics and statistics.

Important Dates
  • December 31, 2019 — Deadline for receipt of preliminary application materials
  • January 15, 2020 — Applicants are notified of preliminary application decision
  • February 14, 2020 — Deadline for receipt of final application materials
  • March 31, 2020 — Award recipients are notified

Duration of Program

Appointments are for one year.

Award Value

Each fellow's university receives the following:

  • $20,000 to pay a stipend to the fellow
  • $8,000 to defray the fellow's tuition, fees and work-study program commitments
  • A small grant to facilitate work on the fellow's research project

The selected fellow must participate in the Summer Internship Program in Research for Graduate Students. The fellow will receive the following:

  • $6,000 salary
  • Transportation allowance for relocating to and from the Princeton area
  • Housing will be provided for interns commuting more than 50 miles

Eligibility

At the time of application, candidates must be enrolled in a doctoral program, have completed all coursework toward the doctorate, and be at the dissertation stage of their program. Dissertation topics in psychometrics, statistics, educational measurement, or quantitative methods will be given priority. At the time of application, candidates will be asked to provide a statement describing any additional financial assistance, such as an assistantship or grant commitment, that they will have during the fellowship period.


For more information please visit the Harold Gulliksen Psychometric Research Fellowship announcement.