[game_edu] game_edu Digest, Vol 90, Issue 7

Stephen Jacobs itprofjacobs at gmail.com
Sat Jan 7 10:15:37 EST 2012


I also combine multiple techniques. Peer evals are sometimes helpful, but most often they only confirm on paper what my own observations have already told me; even so, they are useful as documentation. Dev blogs are more useful, but they are most effective when each student's responsibilities are clearly defined.

Clearly defining individual student strengths and weaknesses up front is also key.

In terms of adding more industry realism, I have at times shuffled team members across projects, and/or demoted a team leader and promoted another from within when needed.

Sent from my iPhone

On Jan 7, 2012, at 10:00 AM, "game_edu-request at igda.org" <game_edu-request at igda.org> wrote:


> Send game_edu mailing list submissions to

> game_edu at igda.org

>

> To subscribe or unsubscribe via the World Wide Web, visit

> http://seven.pairlist.net/mailman/listinfo/game_edu

> or, via email, send a message with subject or body 'help' to

> game_edu-request at igda.org

>

> You can reach the person managing the list at

> game_edu-owner at igda.org

>

> When replying, please edit your Subject line so it is more specific

> than "Re: Contents of game_edu digest..."

>

>

> ----------------------------------------------------------------------

> IGDA Education SIG

> ----------------------------------------------------------------------

>

> Today's Topics:

>

> 1. Re: Evaluating individual students on semester long

> (Robert R. Kessler)

> 2. Re: Evaluating individual students on semester long project

> classes (Bangsberg, Keld)

>

>

> ----------------------------------------------------------------------

>

> Message: 1

> Date: Fri, 6 Jan 2012 09:13:14 -0700

> From: "Robert R. Kessler" <kessler at cs.utah.edu>

> Subject: Re: [game_edu] Evaluating individual students on semester

> long

> To: game_edu at igda.org

> Message-ID: <294CAEEA-5734-459D-96E7-34018ADC4ACE at cs.utah.edu>

> Content-Type: text/plain; charset=us-ascii

>

> Casey makes an excellent point. Our team sizes range from 10 to 15, as we attempt to create an environment that causes more team issues to surface, makes leadership more important, and ultimately is more like the real world (about the only thing we could do better would be to have our student teams work with teams in some other location, which would certainly increase the realism). With small teams, the issues are much less difficult.

>

> BTW - Thanks to everyone for all of your suggestions and for pointing out the research. We really appreciate it.

>

> Bob.

>

>>

>>

>> Message: 3

>> Date: Fri, 6 Jan 2012 02:52:24 +0000

>> From: Casey ODonnell <caseyod at uga.edu>

>> Subject: Re: [game_edu] Evaluating individual students on semester

>> long

>> To: IGDA Game Education Listserv <game_edu at igda.org>

>> Message-ID: <CB2BCACE.5824%caseyod at uga.edu>

>> Content-Type: text/plain; charset="us-ascii"

>>

>> Define "large groupwork" in this study.... Because I typically shoot for 4 person teams... So low visibility isn't really an issue. I couldn't really find the typical team size in the reports.

>>

>> Casey

>>

>> From: Yusuf Pisan <yusuf.pisan at uts.edu.au<mailto:yusuf.pisan at uts.edu.au>>

>> Reply-To: IGDA Game Education Listserv <game_edu at igda.org<mailto:game_edu at igda.org>>

>> Date: Fri, 6 Jan 2012 10:27:07 +1100

>> To: IGDA Game Education Listserv <game_edu at igda.org<mailto:game_edu at igda.org>>

>> Subject: Re: [game_edu] Evaluating individual students on semester long

>>

>> The visibility of individual work on the project provided by the online time records improved the situation by reducing the percentage of groups with a near equal mark allocation to about 55%. This result proved that reliable evidence of individual efforts empowered team members to claim better marks, and the groups were willing to accept the resulting mark differentiation. The most significant change in peer assessment mark distribution occurred with the introduction of the current TeCTra system, which has facilitated the peer evaluation, feedback, and review assessment processes.
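
For concreteness, a rough Python sketch of how such a "near-equal mark allocation" statistic might be computed. The 5% tolerance and the data layout are assumptions for illustration only; they are not details taken from the study or from TeCTra.

def is_near_equal(marks, tolerance=0.05):
    """True if every member's share of the group's peer marks is
    within `tolerance` of a perfectly even split."""
    total = sum(marks)
    even_share = 1.0 / len(marks)
    return all(abs(m / total - even_share) <= tolerance for m in marks)

def near_equal_fraction(groups):
    """Fraction of groups whose peer-mark allocation is near-equal."""
    return sum(1 for marks in groups if is_near_equal(marks)) / len(groups)

# Example: three 4-person groups; only the last differentiates marks.
groups = [[25, 25, 25, 25], [24, 26, 25, 25], [40, 20, 20, 20]]
print(near_equal_fraction(groups))  # -> 0.666..., i.e. 2 of 3 near-equal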

>> -------------- next part --------------

>> An HTML attachment was scrubbed...

>> URL: <http://seven.pairlist.net/pipermail/game_edu/attachments/20120106/e57c6bb6/attachment-0001.html>

>>

>> ------------------------------

>>

>> _______________________________________________

>> game_edu mailing list

>> game_edu at igda.org

>> http://seven.pairlist.net/mailman/listinfo/game_edu

>>

>>

>> End of game_edu Digest, Vol 90, Issue 6

>> ***************************************

>>

>

>

>

> ------------------------------

>

> Message: 2

> Date: Fri, 6 Jan 2012 19:38:47 -0500

> From: "Bangsberg, Keld" <kbangsberg at aii.edu>

> Subject: Re: [game_edu] Evaluating individual students on semester

> long project classes

> To: IGDA Game Education Listserv <game_edu at igda.org>

> Message-ID:

> <0FB69FC5D959BD4480310916A79CD65B042793D5C7 at CSREXCMSADM04.admin.edmc.adm>

>

> Content-Type: text/plain; charset="ISO-8859-1"

>

> It has been interesting to read the different responses. I use a blend of a few different approaches:

>

> I have established a very basic rubric that outlines expectations across a number of parameters, such as quality of work, communication, and meeting deadlines. I grade each student, and everyone completes both a self-evaluation and an evaluation of their peers using this framework (on a scale of 1-10 for each item). I use the self and peer evaluations as a reference to see whether my perceptions and observations align with those of the students. Typically my numbers fall between the self evaluation and the peer evaluations. I then adjust my original scoring as I see fit.
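
For concreteness, a rough Python sketch of this kind of blended rubric scoring. The criteria names, the plain averaging, and the in-range check are illustrative assumptions, not the exact rubric described above.

from statistics import mean

CRITERIA = ["quality_of_work", "communication", "meets_deadlines"]

def summarize(self_eval, peer_evals, instructor_eval):
    """Average each 1-10 rubric to one score per source and note
    whether the instructor's score falls between the self and peer
    averages."""
    self_score = mean(self_eval[c] for c in CRITERIA)
    peer_score = mean(mean(p[c] for c in CRITERIA) for p in peer_evals)
    instructor_score = mean(instructor_eval[c] for c in CRITERIA)
    low, high = sorted([self_score, peer_score])
    return {
        "self": self_score,
        "peers": peer_score,
        "instructor": instructor_score,
        "instructor_in_range": low <= instructor_score <= high,
    }

# Example: one student, two peer reviews; the instructor's average
# (8.0) lands between the peer average (~7.83) and the self average
# (~8.67).
summary = summarize(
    self_eval={"quality_of_work": 9, "communication": 8, "meets_deadlines": 9},
    peer_evals=[
        {"quality_of_work": 7, "communication": 8, "meets_deadlines": 8},
        {"quality_of_work": 8, "communication": 7, "meets_deadlines": 9},
    ],
    instructor_eval={"quality_of_work": 8, "communication": 8, "meets_deadlines": 8},
)
print(summary["instructor_in_range"])  # -> True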

>

> We do this at both midterms and finals so that there are at least two points of reference.

>

> - Keld

>

> Keld Bangsberg

> Academic Department Director

> * Game Art & Design

> * Media Arts & Animation

> * Visual & Game Programming

>

> Ai - The Art Institute of Portland

> 1122 NW Davis ~ Portland, OR 97209-2911

> kbangsberg at aii.edu

> 503.382.4749

>

>

>

> -----Original Message-----

> From: game_edu-bounces at igda.org [mailto:game_edu-bounces at igda.org] On Behalf Of Robert R. Kessler

> Sent: Wednesday, January 04, 2012 7:10 AM

> To: game_edu at igda.org

> Subject: [game_edu] Evaluating individual students on semester long project classes

>

> We have been teaching semester-long video game development project classes for years (two semesters in our capstone class and also in our master's program). We have tried a whole bunch of different techniques to evaluate the performance of individual students within the team. As anyone who has taught these classes knows, there are always a handful of students on each team who work their tails off and contribute most of the code or art assets to the project. Many students put in the time that is required of the class (we use the rough formula of 3 hours per week per credit hour, so about 9 hours per week for a 3-credit class) and get their tasks done, but because of other commitments never do any more. Lastly, there are the handful that just don't get much done. The question is how to set up an evaluation system that fairly evaluates these students and gives each the appropriate grade. For example, when the middle-tier students do their work, but nothing extra, do they deserve A's? After all, they have done everything that you asked of them. And then what do you do for the overachievers?

>

> The techniques that we've tried are listed below (note: we typically have an area lead appointed over each of engineering, arts, and design, and then a team lead).

>

> 1) Have each area lead report whether each student did their work and how well they did it (we've tried evaluating them as: C if they did their job; B if they did it and did an excellent job; A if they did B-level work and then went above and beyond and did more, as in worked on extra items from the sprint backlog). The area leads are evaluated by the team lead, and the team lead is in turn evaluated by the area leads.
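
As a rough sketch, that tier rule could be written as below; the function name and inputs are hypothetical, and the post does not say what grade falls below C.

def tier_grade(did_job, excellent, extra_backlog_items):
    """Map an area lead's report to a letter grade per the rule above."""
    if not did_job:
        return "below C"  # unspecified in the original scheme
    if excellent and extra_backlog_items > 0:
        return "A"  # excellent work plus extra sprint-backlog items
    if excellent:
        return "B"
    return "C"

print(tier_grade(True, True, 2))  # -> "A"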

>

> 2) Have each student create a gamer blog and record in it each week what they were assigned to do, then make an entry at the end of the week with what they accomplished, showing evidence. We, the teaching staff, then go in and evaluate the blogs weekly.

>

> Neither of these has been entirely satisfactory.

>

> Note, from an experience point of view, it is certainly true that in business some folks work harder than others. Management usually knows who those people are, mostly because they are all in the same environment all week long, whereas we really only interact with the students during class time. Management can give raises, bonuses, whatever. But for us, the limited contact time is one key problem.

>

> So, is there a technique that you have used that you feel works really well for evaluating individual student performance when working on a long term project?

>

> Thanks.

> Bob.

>

> _______________________________________________

> game_edu mailing list

> game_edu at igda.org

> http://seven.pairlist.net/mailman/listinfo/game_edu

>

>

>

> ------------------------------

>

> _______________________________________________

> game_edu mailing list

> game_edu at igda.org

> http://seven.pairlist.net/mailman/listinfo/game_edu

>

>

> End of game_edu Digest, Vol 90, Issue 7

> ***************************************


