[game_edu] game_edu Digest, Vol 90, Issue 4

McGill, Monica mmcgill at bumail.bradley.edu
Thu Jan 5 10:13:28 EST 2012




Bob,

I've been refining my eval system for years. It still isn't "there," but I
feel it is getting closer.

I've divided my projects into sprints for each phase of design/development.
At the end of each sprint, students fill out a brief peer and self eval
form. I use these to guide my grading of each student. The forms
are confidential; no one other than me sees them.

Students receive grades after each sprint. This serves as formative feedback
for them, so they can correct their behavior early in the project (or be
prompted to drop the class).

We also use Assembla for project management tracking so I can see who is
assigned what each sprint and whether or not they are completing their
tasks. Also, students use Google Docs for reports. In Google Docs, I can
tell from the history who is contributing more versus less. (This came in
handy last semester, when I docked a student for lack of contributions. When
he asked why, since he had contributed to the document, I was able to state
that he had added 40 words to a 10-page document and that his other six team
members had obviously contributed significantly more.)

As far as the individual/team split goes, I've tried a number of ratios. After
talking in depth with students this past semester, their ideal split is
70% individual grade and 30% team grade. They are tired of watching
peers who contribute little, or contribute very poor quality work,
continue to get nearly the same grades as the hardworking students.
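For anyone who wants to see how that weighting plays out, here is a minimal sketch. The function name, the 0-100 scale, and the sample scores are my own invention; only the 70/30 ratio comes from the students' suggestion.

```python
def blended_grade(individual: float, team: float,
                  w_individual: float = 0.7, w_team: float = 0.3) -> float:
    """Blend an individual score with a team score (both on a 0-100 scale)."""
    assert abs(w_individual + w_team - 1.0) < 1e-9, "weights must sum to 1"
    return w_individual * individual + w_team * team

# A strong contributor on a weaker team mostly keeps their own grade:
# 0.7*95 + 0.3*70 = 87.5
print(blended_grade(95, 70))

# A weak contributor can no longer coast on the team's work:
# 0.7*55 + 0.3*90 = 65.5
print(blended_grade(55, 90))
```

The point of the heavier individual weight is exactly what the students asked for: the team component still matters, but it can no longer carry (or sink) an individual by more than 30 points.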

If anyone is interested in my peer/self evals, please email me directly.

Hope this helps,

Monica

Blog: http://www.virtuallyfine.com
Twitter: virtuallyfine


> ----------------------------------------------------------------------

>

> Message: 1

> Date: Wed, 4 Jan 2012 08:09:43 -0700

> From: "Robert R. Kessler" <kessler at cs.utah.edu>

> Subject: [game_edu] Evaluating individual students on semester long

> project classes

> To: game_edu at igda.org

> Message-ID: <5CC6599F-A2AE-48FE-9C24-083BAC785826 at cs.utah.edu>

> Content-Type: text/plain; charset=us-ascii

>

> We have been teaching semester long video game development project classes for

> years (two semesters in our capstone class and also in our master's program).

> We have tried a whole bunch of different techniques to try and evaluate the

> performance of individual students within the team. For anyone that has

> taught these classes, there are always a handful of students on each team that

> work their tails off and contribute most of the code or art assets to the

> project. Many students put in the time that is required of the class (we use

> the rough formula of 3 hours per week per credit hour of the class) and get

> their tasks done, but because of other commitments never do any more. Lastly

> there are the handful that just don't get much done. The question is how do

> you set up an evaluation system that fairly evaluates these students and gives

> them the appropriate grades? For example, when you have the middle tier

> students who do their work, but nothing extra, do they deserve A's?

>

> They have done everything that you asked of them. Then what do you do for

> the overachievers?

>

> The techniques that we've tried are (note, we typically have an area lead

> appointed over engineering, arts, and design, and then a team lead)

>

> 1) Have each area leader report whether each student did their work and how

> well they did it (we've tried evaluating them as: C - if they did their job, B

> - if they did it and did an excellent job, A - if they did B work and then

> went above and beyond and did more - as in worked on extra items from the

> sprint backlog). The area leads are evaluated by the team lead, and the team

> lead in turn is evaluated by the area leads.

>

> 2) Have each student create a gamer blog and record in it, each week, what

> they were assigned to do, then make an entry at the end of the week with what

> they accomplished, showing evidence. We, the teaching staff, then go in and

> evaluate it weekly.

>

> Neither of these has been quite satisfactory.

>

> From an experience point of view, it is certainly true that in business

> some folks work harder than others. Management usually knows who those people

> are, mostly because they are all in the same environment all week long,

> whereas we are stuck with only really interacting with the students during

> class time. Management can give raises, bonuses, whatever. But the limited

> contact time is one key problem.

>

> So, is there a technique that you have used that you feel works really well

> for evaluating individual student performance when working on a long term

> project?

>

> Thanks.

> Bob.

>

>

>

> ------------------------------

>

> Message: 2

> Date: Wed, 4 Jan 2012 10:25:44 -0600

> From: Peter Border <pborder at msbcollege.edu>

> Subject: Re: [game_edu] Evaluating individual students on semester

> long project classes

> To: IGDA Game Education Listserv <game_edu at igda.org>

> Message-ID:

> <58F0FCE57E8B574686CF5742D5D5BA0E3DD391DA73 at BOUVIER.msb.priv>

> Content-Type: text/plain; charset="us-ascii"

>

> We have a 12-week intensive course that does the same sort of thing. I grade

> people based on a two-page report where they grade each other (not just the

> manager, but everybody), and based on what I've seen during class. It's

> usually very clear who's working hard and who's not.

>

>

>

> Peter Border

> Game and Application Design Chairman

> Globe University/Minnesota School of Business

> 1401 West 76th St

> Richfield, MN 55423

> pborder at msbcollege.edu


> ------------------------------

>

> Message: 3

> Date: Wed, 4 Jan 2012 16:29:24 +0000

> From: WEARN Nia H <N.H.Wearn at staffs.ac.uk>

> Subject: Re: [game_edu] Evaluating individual students on semester

> long project classes

> To: 'IGDA Game Education Listserv' <game_edu at igda.org>

> Message-ID:

> <7EBE8096B1A1B24C89CA3A7EF8ECFE0F71C15D00CA at CRWNMAIL.staff.staffs.ac.uk>

>

> Content-Type: text/plain; charset="us-ascii"

>

> We have similar modules, and similar issues to be honest.

>

> We've taken to having a 70/30 split:

> - 70% - Group work mark (the assets we ask them to make, milestoned

> throughout the semester)

> - 30% - Individual mark, made up from forum posts and a self evaluation. We

> look at the quality and timeliness of posts - ideally they should be updated

> weekly.

>

> As our groups have gotten bigger, to incorporate a wider range of skills,

> we've moved away from lab times toward making sure each group has time to

> meet up once a week in a meeting room, which the tutors pop into to check on

> progress.

>

> We used to peer assess, but it got messy - it was always a flawed system - so

> we've moved away from that. It will be interesting to see how this 70/30

> split works.

>

> I'll report back when I can bear to look at the marking...

>

> Nia

>


> ------------------------------

>

> Message: 4

> Date: Wed, 04 Jan 2012 15:16:40 -0300

> From: "Anibal Menezes" <amenezes at imagecampus.com.ar>

> Subject: Re: [game_edu] Evaluating individual students on semester

> long projectclasses

> To: "IGDA Game Education Listserv" <game_edu at igda.org>

> Message-ID:

> <CHILKAT-MID-7f7f5201-d783-b2d3-99fa-7f0bdbe9800e at design1.empresa2000.local>

>

> Content-Type: text/plain; charset="us-ascii"

>

> An HTML attachment was scrubbed...

> URL:

> <http://seven.pairlist.net/pipermail/game_edu/attachments/20120104/88c0a531/at

> tachment-0001.htm>

>

> ------------------------------

>

> Message: 5

> Date: Wed, 4 Jan 2012 20:14:13 -0500

> From: cindyPoremba <cindy at docgames.com>

> Subject: [game_edu] Call for Submissions to the Research and

> Experimental Game Festival (FDG 2012)

> To: game_edu at igda.org

> Message-ID: <761EAD82-8FB9-4934-8B31-DFEEF78BCABB at docgames.com>

> Content-Type: text/plain; charset=windows-1252

>

> Please feel free to forward this Call for Submissions to anyone you feel

> may/should be interested.

>

> Cheers,

> Cindy Poremba

> Postdoctoral Researcher, Georgia Tech

> Chair, Foundations of Digital Games 2012 Research and Experimental Game

> Festival

>

> ------------------

>

> Call for Submissions to the Research and Experimental Game Festival

>

> Foundations of Digital Games 2012

> May 29-June 1, 2012

> Raleigh, North Carolina

>

>

> Important Dates

>

> Research and Experimental Game Festival Submission: 19 January 2012

> Research and Experimental Game Festival Notification: 01 March 2012

>

>

> Research and Experimental Games Festival Submissions

>

> The Festival is designed to showcase playable games that are experimental or

> have a research component. Submitted games could be significant because they

> are designed to answer a research question or experiment with the design

> process, or because their technological components represent research

> advancements. Works in progress are permitted, but the game will ideally

> include at least one playable level (or comparable unit of play time). Works

> that have not yet reached this stage may be more suitable for the conference

> demo track.

>

> Submissions should also include a 2-4 page writeup of the project which

> addresses requirements (technical and otherwise) needed for demonstrating the

> game at FDG. The text should outline the game's research context, and how the

> work demonstrates rigor in methodology and a contribution to knowledge.

> Submissions should also include a link to the game, and/or substantive

> documentation, hosted on your own server or one of your choosing.

>

> We welcome and encourage works exploring a variety of disciplinary approaches

> and methodologies, including interdisciplinary collaborations. It is the

> responsibility of the contributor to ensure all necessary information is

> accessible at all times during the judging period (19 January 2012 to 01

> March 2012).

>

> Games will be peer reviewed by an international panel comprised of academics,

> artists, and game designers. The Festival accepts works at all stages of

> publishing, regardless of source funding, provided the work clearly

> demonstrates an advancement to current and/or ongoing research. Works

> previously submitted to other festivals or exhibitions are permitted.

>

> General Submission Guidelines

>

> The 2-4 page writeup must be in either PDF or DOC format, and comply with the

> official ACM proceedings format using one of the templates provided at

> http://www.acm.org/sigs/pubs/proceed/template.html.

>

> Submissions must be made via EasyChair at

> https://www.easychair.org/conferences/?conf=fdg2012

>

> If you have any questions or problems, please do not hesitate to contact the

> Festival Chair Cindy Poremba at cindy @ docgames dot com.

>

> ------------------------------

>

> _______________________________________________

> game_edu mailing list

> game_edu at igda.org

> http://seven.pairlist.net/mailman/listinfo/game_edu

>

>

> End of game_edu Digest, Vol 90, Issue 4

> ***************************************



