Using Professional Competencies to Assess Learning in Student Affairs Graduate Programs

By Dr. Phyllis McCluskey-Titus, Associate Professor, Department of Educational Administration and Foundations; Ramo Stott, Residential Life Coordinator, Louisiana State University; Catherine Poffenbarger, Office of the Provost, Illinois State University; Kaitlin Ballard, Residence Hall Director, Illinois Wesleyan University

Originally published in Progressive Measures, Fall 2014, Volume 10, Issue 1. 

The purpose of this research project was to develop and pilot a process to assess learning in student affairs graduate programs.  Using the ACPA/NASPA Competency Areas for Student Affairs Practitioners (2010), our research team created an assessment instrument to identify levels of competency reported by students entering a graduate program in student affairs, midway through the program, and at the end of their program.  This research was made possible thanks to a grant provided by the NASPA Assessment, Evaluation, and Research Knowledge Community.  This article introduces our student affairs graduate program and the professional competencies, briefly reviews previous research, explains the assessment instrument and process, discusses findings, and shares our plans for continuing this research on other campuses.  This study contributes to the existing research on graduate students’ learning and provides evidence to support curriculum revision and enhancement of other learning experiences that student affairs master’s degree students need for success in professional practice, using the ACPA/NASPA competencies as a model.

The CSPA Graduate Program at Illinois State University

College Student Personnel Administration (CSPA) is a two-year Master of Science degree program preparing students to be successful professionals in entry-level student affairs positions in colleges and universities.  Coursework covers areas such as student affairs law, history and foundations of student affairs, student development and organizational theory, and educational research and statistics.  To provide a holistic learning experience, the program emphasizes theory-to-practice, requiring students to hold a graduate assistantship (GA) or full-time employment at a college or university while enrolled in the program and to complete one or two practica.  All practical experiences must be within functional areas of student affairs.  GAs are one- or two-year work experiences in which students spend a minimum of 20 hours per week working in a student affairs office at a Bloomington-Normal area college or university, such as Illinois State University, Illinois Wesleyan University, Lincoln College-Normal, or Heartland Community College.  Examples of offices in which students hold GAs include residential life, student activities, and academic advising.  Practica are semester-long, 150-hour internships held at higher education institutions, including those previously listed, institutions across the nation, and occasionally abroad.  The opportunity for up to four different practical experiences allows CSPA students to gain experience in different functional offices and types of institutions.  Admission to the CSPA program requires completion of a graduate application, completion of the Graduate Record Examination (GRE) for applicants with undergraduate GPAs below 3.0, transcripts from all higher education institutions attended, a personal statement, a résumé, and a program interview with a CSPA faculty member and current students.
Students entering the CSPA program come from a wide variety of academic disciplines, ranging from the arts and sciences to business, and the program includes both recent bachelor’s degree graduates and working professionals seeking a master’s degree (http://education.illinoisstate.edu/ms_cspa/).  The average undergraduate GPA for new students is 3.6.

ACPA/NASPA Competencies

In 2010, the two professional associations in student affairs (ACPA: College Student Educators International and NASPA: Student Affairs Administrators in Higher Education) collaborated on the development of a comprehensive listing of competencies for student affairs professionals at all levels of their careers.  Ten competency areas were developed: Advising and Helping; Assessment, Evaluation, and Research; Equity, Diversity, and Inclusion; Ethical Professional Practice; History, Philosophy, and Values; Human and Organizational Resources; Law, Policy, and Governance; Leadership; Personal Foundations; and Student Learning and Development (ACPA/NASPA, 2010).  In addition, the outcomes within each competency were divided into Basic, Intermediate, and Advanced levels so that professionals could continue their growth within the field over time.  These competencies allow individual staff members, offices, and divisions within student affairs to develop training and professional development programs to better prepare for higher-level positions.  A preferred outcome for students graduating from Illinois State University’s CSPA program would be demonstrated competence at the Basic level across all ten competency areas.

 

Assessment Team/Process

The purpose of this grant-funded research project was to develop and test a process for assessing learning in student affairs graduate programs using the ACPA/NASPA competencies.  The assessment team comprised a faculty member who coordinates the CSPA program and three graduate students who had recently completed the CSPA program and begun working in full-time positions.  During the bulk of the study, the three new professionals were in their final year of the CSPA program.  After being awarded the grant, the team began meeting in April 2013 to develop the assessment instrument, complete Institutional Review Board (IRB) approval forms, and collect and analyze data.  The team discussed all aspects of the project, completed necessary training and base-level information gathering, divided up tasks, and set out to complete individual parts of the project.  The team met regularly throughout summer 2013 and into the 2013-2014 academic year to share and review progress toward completion of the study.

Literature Review

Little research has been conducted on the ways graduate students develop competencies in student affairs preparation programs.  Existing literature states that competencies are cultivated through curriculum, graduate assistantships, and practica (Kuk & Banning, 2009; Renn & Jessup-Anger, 2008) but is unclear about which aspects of graduate work contribute to learning specific competencies.  Komives and Smedick (2012) suggested that using standards and outcomes developed by professional associations can help validate programs and services.  The primary competencies considered in this assessment are the ACPA/NASPA (2010) competencies, developed jointly by the two major student affairs professional associations.  Researchers generally agree that new professionals leave student affairs graduate programs highly competent in knowledge of theory, technology, problem solving, program planning, ethics, and standards of practice (Cuyjet et al., 2009; Dickerson et al., 2011; Herdlein, 2004; Waple, 2006).  Research also suggests that new professionals are less prepared at the master’s level in strategic planning, budgeting and finance, research, and assessment (Herdlein, 2004; Waple, 2006).  In a study by Hoffman and Bresciani (2010) that analyzed student affairs position descriptions, about 27% of jobs posted at a placement conference required competency in assessing student learning outcomes.  Both new and senior-level staff believe competency in planning, budgeting, and assessment is important to new professionals in practice (Cuyjet et al., 2009; Dickerson et al., 2011); therefore, it is necessary that students leaving graduate programs have acquired at least basic skills in each of the primary competency areas.

Methodology

The ACPA/NASPA Competency Areas for Student Affairs Practitioners (2010) defined ten categories of competency and 124 Basic outcome areas in which student affairs professionals are expected to develop a level of confidence in their practice.  Using these Basic level outcomes, the research team developed an assessment instrument to gain a better understanding of the competency development of individuals enrolled in student affairs master’s degree programs.  A pilot study was conducted using Illinois State University students enrolled in, and alumni who recently graduated from, the CSPA master’s degree program (n = 68).  The research design included an online survey administered through Select Survey that was composed of 151 questions: 7 demographic items, 124 ratings of Basic level competencies, 10 rankings of areas where the competencies were developed, and 10 open-ended response questions.  Participants self-reported their level of development on all ten competencies using a seven-point scale: 1 = None, 2 = Very weak, 3 = Some, 4 = Moderate, 5 = Confident, 6 = Strong, and 7 = Exceptional.  Each of the ten competencies had between eight and seventeen outcomes to rate, yielding a total of 124 rating questions.  In addition, participants were asked to report where each competency had been developed or learned.  Participants could rank up to three different learning venues from a list of 11 areas: Family/personal life, Undergraduate coursework, Undergraduate involvement, Graduate coursework, Graduate assistantship, Higher education employment, Non-higher education employment, Practicum/internship, Professional association, Mentor, or Volunteer experiences.

Students invited to participate in this pilot study during the fall 2013 semester included all current first-year (n = 26) and second-year (n = 24) CSPA students and all May/August 2013 graduates of the program (n = 18).  A total of 51 usable surveys were started, a 75% response rate, although only 46% of those invited completed the full survey.  Of the respondents, 35 were women and 16 were men.  Thirty-nine of the respondents were full-time students in the program, 8 were part-time students, and 3 completed the program with a blend of full-time and part-time status.  There were 15 participants from the 2013 graduating class, 15 from the 2014 class, and 21 from the 2015 class.  Participants were contacted via email, asked to complete the online survey, and given two weeks to do so.  In addition to the initial email, a reminder email was sent with a week remaining, and two reminder notices were posted on the CSPA 2012-2013 and CSPA 2013-2014 Facebook group pages.  Ten $20 gift cards were provided as incentives to complete the survey and were awarded to respondents through a random drawing.

After data were collected, the results were analyzed using Excel.  Demographic data, such as level of program completion, gender, and full- or part-time enrollment status, were examined first in order to contextualize the competency-specific information.  The data for each competency stem rating were studied broadly to assess the average development level reported for each competency as a whole.  The development area data were also reviewed to measure the frequency of each type of response.  From there, the competency areas were divided, and each member of the team was assigned two or three competency results to analyze in depth.  For the detailed analysis of each competency, the highest- and lowest-rated stems were identified, along with the most frequently cited areas where participants reported that learning took place.  Patterns and trends were then drawn from the qualitative open-ended responses.
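The core numeric steps above (recoding the 1-7 scale to 0-6, as described in the note to Table 1, then averaging each competency's item ratings) were performed in Excel.  A minimal Python sketch of the same recode-and-average logic, using hypothetical item ratings rather than actual study data, might look like this:

```python
# Sketch of the recode-and-average step; the study itself used Excel,
# and these ratings are hypothetical, not actual survey responses.

def competency_mean(raw_ratings):
    """Recode raw 1-7 survey responses to 0-6 and return the mean.

    Subtracting 1 ensures a "None" response (raw score 1) contributes
    zero development value rather than one point.
    """
    recoded = [r - 1 for r in raw_ratings]
    return sum(recoded) / len(recoded)

# Hypothetical responses to one competency's Basic-level outcome items.
advising_items = [5, 6, 4, 5, 7, 5]                # raw 1-7 scale
print(round(competency_mean(advising_items), 2))   # 4.33 on the 0-6 scale
```

The per-competency means in Tables 1 and 2 are averages of this kind, computed within each demographic group.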

As part of completion of the CSPA program, graduating students are required to complete a one-on-one interview with the program advisor.  During these exit interviews, participants had the opportunity to respond to additional questions regarding competencies to supplement information that the research team collected from the online survey.  Fifteen students who graduated in May 2014 participated in the exit interviews.  The advisor gave participants a worksheet and asked them to indicate the competency area(s) where they felt most prepared, least prepared, and most improved directly due to completion of the master’s program.  Participants were also asked to note any additional comments about their development of competencies within the program.  Twelve of the fifteen participants made additional comments.  Completing the worksheet encouraged conversation between the participant and advisor, including additional recommendations that would enhance student learning within the graduate program.

Results

For each of the ten competencies assessed, overall mean scores related to competency development are provided in Table 1.  In addition, mean scores are offered as comparisons between first-year students, second-year students, and program graduates; male and female respondents; and full-time and part-time students.

Table 1.  Mean scores of Competency Survey respondents by class, gender, and student status

| Competencies | All responses (n = 51) | 1st-year students (n = 21) | 2nd-year students (n = 15) | 2013 graduates (n = 15) | Female respondents (n = 36) | Male respondents (n = 15) | Full-time students (n = 39) | Part-time students (n = 8) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Advising and Helping | 4.32 | 4.15 | 4.50 | 4.42 | 4.36 | 4.21 | 4.33 | 4.33 |
| Assessment, Evaluation, and Research | 3.05 | 2.30 | 3.69 | 3.74 | 3.13 | 2.63 | 3.10 | 2.81 |
| Equity, Diversity, and Inclusion | 4.21 | 4.09 | 4.53 | 4.12 | 4.25 | 4.00 | 4.27 | 3.97 |
| Ethical Professional Practice | 4.19 | 4.08 | 4.42 | 4.15 | 4.22 | 4.04 | 4.11 | 4.32 |
| History, Philosophy, and Values | 4.13 | 4.09 | 4.36 | 3.99 | 4.22 | 3.71 | 4.08 | 3.80 |
| Human and Organizational Resources | 3.79 | 3.48 | 4.32 | 3.80 | 3.87 | 3.38 | 3.70 | 4.07 |
| Law, Policy, and Governance | 2.99 | 2.25 | 3.58 | 3.62 | 3.07 | 2.58 | 3.06 | 2.46 |
| Leadership | 4.28 | 4.06 | 4.56 | 4.39 | 4.29 | 4.26 | 4.20 | 4.49 |
| Personal Foundations | 4.73 | 4.69 | 4.80 | 4.71 | 4.77 | 4.50 | 4.63 | 5.07 |
| Student Learning and Development | 4.19 | 4.35 | 4.22 | 3.93 | 4.24 | 3.95 | 4.14 | 4.52 |

Note.  When analyzing data, original scores of 1 through 7 were recoded as 0 through 6 so as not to add value to competencies that students did not possess.

The results of this study were largely in line with existing literature on the ACPA/NASPA competencies.  The two areas that were markedly lower across all demographic groups were Law, Policy, and Governance and Assessment, Evaluation, and Research.  As noted in Table 1, the recoded scale runs from 0 to 6.  Most of the averages show that students rated themselves just above the scale midpoint, at about 4 out of 6.  Personal Foundations was clearly the highest result, with an overall average of 4.73 out of 6.  This strong sense of personal foundations was particularly evident among part-time students, who averaged 5.07 out of 6.

Keeping in mind that learning in graduate school can result from a complex combination of factors, Table 2 reflects variations that cannot be seen in Table 1.  These results combine demographics to reflect multiple identities and how they may influence learning in the graduate program.  As might be expected, second-year students and recent graduates tended to rate themselves higher than first-year students.  This is especially evident in Law, Policy, and Governance, which showed more than a one-point increase after the first year, almost certainly because the Higher Education Law course is not offered to first-semester students.  Other areas of notable improvement after the first year of study in the CSPA program were Assessment, Evaluation, and Research and Student Learning and Development.

The areas where students reported learning these Basic competencies are shown in Table 3.  Although survey respondents could choose from eleven venues, most students reported learning through Graduate coursework, Graduate assistantships, and Higher education employment.  Family/personal life was also relevant to the development of some competencies that are more likely to vary with the temperament and personality of the individual (i.e., Advising and Helping; Equity, Diversity, and Inclusion; and Personal Foundations).  Aside from Personal Foundations, Graduate assistantship and Graduate coursework were the most frequently reported primary origins of learning.
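The venue tallies described above reduce to a frequency count over the ranked responses.  As an illustrative sketch only (the responses below are hypothetical, not study data), the counting step might look like this:

```python
# Sketch of the learning-venue tally; each respondent could rank up to
# three of the eleven venues for a given competency. Data are hypothetical.
from collections import Counter

responses = [
    ["Graduate coursework", "Graduate assistantship", "Mentor"],
    ["Graduate assistantship", "Graduate coursework"],
    ["Graduate coursework", "Higher education employment", "Graduate assistantship"],
    ["Family/personal life"],
]

# Count every venue mentioned, regardless of rank position.
tally = Counter(venue for ranking in responses for venue in ranking)
for venue, count in tally.most_common(3):
    print(venue, count)
```

A variant weighting first-ranked venues more heavily would be a natural refinement, but the study reports simple frequencies of citation.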

The results of the exit interviews conducted in the spring semester support the findings reported in the survey by second-year students during the fall.  Participants stated that they believed they were most prepared in Student Learning and Development and in Advising and Helping, which were the second- and third-highest levels of competency reported among second-year students.  In addition, students reported feeling most unprepared in Assessment, Evaluation, and Research and in Law, Policy, and Governance, the two lowest-rated competencies.  Students indicated that they gained the most competency development specifically related to the CSPA program in Student Learning and Development and in History, Philosophy, and Values.  The survey results showed that the majority of participants indicated their learning in these two competencies took place in Graduate coursework and Graduate assistantships.

 

Table 2.  Mean scores of competency survey respondents accounting for intersectionality

| Competencies | All responses | 1st-year F | 1st-year M | 2nd-year F | 2nd-year M | 2013 grad. F | 2013 grad. M | Full-time F | Full-time M | Part-time F | Part-time M |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Advising and Helping | 4.32 (n = 47) | 4.30 (n = 15) | 3.81 (n = 6) | 4.45 (n = 10) | 5.00 (n = 1) | 4.33 (n = 10) | 4.58 (n = 5) | 4.36 (n = 28) | 4.29 (n = 11) | 4.33 (n = 9) | 3.62 (n = 1) |
| Assessment, Evaluation, and Research | 3.05 (n = 36) | 2.37 (n = 13) | 2.08 (n = 4) | 3.69 (n = 9) | n/a (n = 0) | 3.75 (n = 8) | 4.11 (n = 2) | 3.26 (n = 21) | 2.67 (n = 5) | 2.97 (n = 8) | 3.56 (n = 1) |
| Equity, Diversity, and Inclusion | 4.21 (n = 36) | 4.07 (n = 13) | 4.15 (n = 4) | 4.53 (n = 9) | n/a (n = 0) | 4.22 (n = 8) | 3.71 (n = 2) | 4.32 (n = 21) | 4.03 (n = 5) | 3.99 (n = 8) | 3.83 (n = 1) |
| Ethical Professional Practice | 4.19 (n = 34) | 4.09 (n = 12) | 4.03 (n = 3) | 4.42 (n = 9) | n/a (n = 0) | 4.17 (n = 7) | 4.06 (n = 2) | 4.12 (n = 19) | 4.07 (n = 5) | 4.46 (n = 8) | 3.89 (n = 1) |
| History, Philosophy, and Values | 4.13 (n = 31) | 4.31 (n = 11) | 3.27 (n = 3) | 4.36 (n = 8) | n/a (n = 0) | 3.90 (n = 7) | 4.33 (n = 2) | 4.16 (n = 19) | 3.60 (n = 4) | 4.29 (n = 6) | 4.07 (n = 1) |
| Human and Organizational Resources | 3.79 (n = 31) | 3.56 (n = 11) | 3.18 (n = 3) | 4.32 (n = 8) | n/a (n = 0) | 3.84 (n = 7) | 3.68 (n = 2) | 3.80 (n = 19) | 3.22 (n = 4) | 4.32 (n = 6) | 4.00 (n = 1) |
| Law, Policy, and Governance | 2.99 (n = 31) | 2.58 (n = 11) | 2.31 (n = 3) | 3.58 (n = 8) | n/a (n = 0) | 3.68 (n = 7) | 3.38 (n = 2) | 3.17 (n = 19) | 2.56 (n = 4) | 3.05 (n = 6) | 2.69 (n = 1) |
| Leadership | 4.28 (n = 31) | 4.03 (n = 11) | 4.18 (n = 3) | 4.56 (n = 8) | n/a (n = 0) | 4.39 (n = 7) | 4.38 (n = 2) | 4.22 (n = 19) | 4.07 (n = 4) | 4.57 (n = 6) | 5.00 (n = 1) |
| Personal Foundations | 4.73 (n = 31) | 4.80 (n = 11) | 4.33 (n = 3) | 4.80 (n = 11) | n/a (n = 0) | 4.70 (n = 7) | 4.75 (n = 2) | 4.69 (n = 19) | 4.38 (n = 4) | 4.95 (n = 6) | 5.00 (n = 1) |
| Student Learning and Development | 4.19 (n = 31) | 3.88 (n = 11) | 3.30 (n = 3) | 4.22 (n = 11) | n/a (n = 0) | 4.57 (n = 7) | 4.86 (n = 2) | 4.18 (n = 19) | 3.94 (n = 4) | 4.29 (n = 6) | 4.00 (n = 1) |

Note.  The survey included an array of gender responses, but male and female were the only ones to garner responses.

Implications for Program Enhancement

In addition to individuals being able to better see what skills and competencies they gained from their graduate experience, the results of this assessment can also help with program-level decisions about where student learning can be enhanced.  Information obtained from the online survey and exit interviews provided a number of recommendations for the CSPA program at Illinois State University.

Areas of strength

A significant sign of strength in the CSPA program was that participants reported being moderately competent to competent across the full range of 124 Basic level competencies of the ACPA/NASPA Competency Areas for Student Affairs Practitioners (2010).  Another key finding was that most learning was reportedly obtained through CSPA program courses.  With so much learning taking place in the classroom, it is critical that the curriculum and its alignment with the ACPA/NASPA competencies be continually evaluated.  Last summer, during the CSPA program faculty retreat, the main topic of conversation was curricular alignment with the professional competencies.  Faculty indicated, through discussion and sharing of learning outcomes and assignments, which courses prepared students in each of the Basic level competencies.

Another notable finding was that skills learned in the classroom were more highly rated when students had opportunities to practice them immediately outside the classroom.  For example, Advising and Helping skills were used on a regular basis in most students’ practical experiences, whereas Law, Policy, and Governance skills were not commonly exercised in graduate assistantships.  One participant shared, “Being able to put that knowledge to practice during summer practicum helped me gain confidence in using the competencies.”  Because graduate assistantships were cited as important to student learning, maintaining contact with supervisors of CSPA students and determining the extent to which the Basic competencies are practiced or reinforced in those offices could be particularly helpful.  Another graduating participant shared, “Many of the competencies that I feel I am most prepared in are as a result of my assistantship and practicum experiences…It is very important that students put great thought and intentionality into where they serve as GAs and practicum students.”

This finding also has implications for program advisement: ensuring that students choose graduate assistantships and practicum sites that will enhance their skills and direct them toward the types of student affairs careers they want.

An aspect of the CSPA program that is heavily emphasized is the concept of theory-to-practice.  This model involves exposing students to a number of foundational student development theories and giving them opportunities to apply the skills and insights of those theories in practical settings.  The survey results affirmed this method.  Participants believed they were competent in using theories in their practice, rating the Student Learning and Development competency at 4.19 overall.  All students surveyed had completed or were finishing the primary course in student development theory, which suggests that this required course allowed them to feel competent using theory in their work.  The current placement of the course at the very beginning of the program seems warranted, and the theory-to-practice emphasis holds clear value for both current students and recent alumni.  One participant commented on the theoretical foundation of the program, stating, “Every class and project completed during the program involved student development (theory).”

These findings seemed to validate the current curriculum, program emphasis, and application of classroom learning in work, assistantship, and practicum settings.  It is interesting that most students reported that the competencies were learned or developed in graduate school.  Perhaps this has occurred because there is no undergraduate major in student affairs, or perhaps the graduate experience is the most recent, and thus the easiest, place in which to attribute these learning outcomes.  This could be an area for further exploration in future studies.

Areas of weakness

Participants reported lower competence in Assessment, Evaluation, and Research, which is significant for the future of the program.  All students in the program are required to take a basic research class that is also taken by all master’s degree students in the College of Education.  An additional assessment course is offered as an elective but is not commonly chosen, owing to interest in other elective options and the perceived workload associated with the assessment course.  Assessment is also taught in the final capstone course, but the exposure is brief, and the course is not taken by all part-time students, who may instead develop a comprehensive portfolio as a capstone experience.  Assessment is an important skill used regularly by student affairs professionals across many different student affairs offices.  The survey responses strongly suggested that graduates of the CSPA program feel deficient with regard to assessment.  Perhaps the assessment course should be a required component of the curriculum, or the current research class could be modified to better prepare students to conduct assessment.  Another suggestion, from a current student, was to encourage practicum and campus administrators to ask graduate students to “assist with actual campus assessments and evaluations,” again underscoring students’ need to apply material in order to feel more competent.

Another weakness, reported indirectly in the exit interviews and in open-ended survey responses, was difficulty seeing or understanding the “bigger picture.”  Participants were comfortable with competencies learned in the classroom that related directly to, and could be applied in, their work or internship settings, but less so with areas such as philosophy or history that asked them to think beyond their day-to-day responsibilities toward potential uses in practice.  One participant did comment about now knowing “where all of this came about” from the course on the history and philosophy of student affairs, but others typically preferred when “supervisors drew connections” to the implications of different student affairs philosophies and helped students make meaning of them in their work.  In addition, “bigger picture” thinking often relates to policy making, in which graduate students and new professionals are not always involved.  Because their positions generally did not involve making policy, survey participants may have had little opportunity to use these more conceptual competencies.

A particularly concerning response was that Law, Policy, and Governance was cited as the weakest area across all demographic breakdowns, even though there is a course devoted to legal issues for student affairs professionals.  This response may suggest a shortcoming in the effectiveness of the course or be a result of the subject itself.  One participant offered an explanation in support of the latter, stating, “I feel that areas such as law require more time than our program would allow to receive a ‘most prepared’ rating.”  Another participant felt that even though they were exposed to relevant case law in the class, students did not understand how specific policies were developed in response to the law.  It would be meaningful in the future for those connections to be made in the classroom and through outside experiences in ways that improve students’ skills in this area.

Future Research

The assessment instrument and exit interview are only two aspects of the planned competency development research.  In spring 2015, the team will partner with student affairs graduate program faculty at Eastern Illinois University and Western Illinois University.  These programs have theoretical and practical elements similar to the CSPA program at Illinois State University, and comparing results across the three institutions will allow for richer discussions and collaborations about curriculum and shared program outcomes.

Another area for additional research is to investigate specific assignments and projects completed in CSPA courses to determine whether students attribute learning from these assignments to specific areas of competency development.  One way to do this would be to update the current course evaluation process and add a section that correlates with the assignments in each course.  Targeted questions could then be asked about individual assignments and their possible connections to the development of specific competencies.  The open-ended question about assignments in the current online instrument yielded varied and sporadic results regarding the types of learning that took place in different courses.  Even though it was an option on the survey, there were very few instances where participants provided detailed information about coursework or mentioned specific assignments that informed their development of particular competencies.  The information from this proposed future research may further support the conclusions drawn from the current research and could also allow for more detailed recommendations for program improvement and curriculum revision.

Conclusions

Using the ACPA/NASPA Professional Competency Areas for Student Affairs Practitioners (2010), our research team created an assessment instrument and interview protocol, surveyed current students and recent alumni, and assessed the findings. The results were significant in initiating discussions among faculty for curriculum advancements to better prepare future and current student affairs practitioners.  Based on the information received through this assessment process, and through the enhanced process planned for spring 2015, we should be able to better support student learning within our CSPA master’s degree program at Illinois State University.

References

ACPA/NASPA. (2010). Professional competency areas for student affairs practitioners. Washington, DC: Authors.

Cuyjet, M. J., Longwell-Grice, R., & Molina, E. (2009). Perceptions of new student affairs professionals and their supervisors regarding the application of competencies learned in preparation programs. Journal of College Student Development, 50(1), 104-119.

Dickerson, A. M., Hoffman, J. L., Anan, B. P., Brown, K. F., Vong, L. K., Bresciani, M. J., Monon, R., & Oyler, J. (2011). A comparison of senior student affairs officers and student affairs program faculty expectations of entry-level professionals’ competencies. Journal of Student Affairs Research and Practice, 48(4), 463-479.

Henning, G. W., Mitchell, A. A., & Maki, P. L. (2008). The assessment skills and knowledge standards: Professionalizing the work of assessing student learning and development. About Campus, 13(4), 11-17.

Herdlein, R. J. (2004). Survey of chief student affairs officers regarding relevance of graduate preparation of new professionals. NASPA Journal, 42, 51-71.

Hoffman, J. L., & Bresciani, M. J. (2010). Assessment work: Examining the prevalence and nature of assessment competencies and skills in student affairs job postings. Journal of Student Affairs Research and Practice, 47(4), 495-512.

Komives, S. R., & Smedick, S. (2012). Using standards to develop student learning outcomes. New Directions for Student Services, 140, 77-88.

Kuk, L., & Banning, J. (2009). Student affairs preparation programs: A competency based approach to assessment and outcomes. College Student Journal, 43(2), 492-502.

McCluskey-Titus, P., Ballard, K., Poffenbarger, C., & Stott, R. (2013, November). Assessing learning in student affairs graduate programs using ACPA/NASPA competencies. Presented at the annual meeting of the National Association of Student Personnel Administrators-Region IV-E, Skokie, IL.

McCluskey-Titus, P., Ballard, K., Poffenbarger, C., & Stott, R. (2014, January). In the classroom and beyond: Assessing where learning takes place. Presented at the 14th Annual Symposium on Teaching and Learning, Illinois State University.

Renn, K. A., & Jessup-Anger, E. R. (2008). Preparing new professionals: Lessons for graduate preparation programs from the national study of new professionals in student affairs. Journal of College Student Development, 49(4), 319-335.

Waple, J. N. (2006). An assessment of skills and competencies necessary for entry-level student affairs work. NASPA Journal, 43, 1-18.

 
