Official Baja SAE Forums

Design Report Scoring

CLReedy21
Baja Godfather

Joined: Nov/30/2008
Location: Marysville, OH
Status: Offline
Points: 736
Posted: Jun/29/2011 at 11:40am
Since the design report feedback sheets from Birmingham were emailed out to teams this year, I figured it would be a good opportunity to question how the design report scoring system is actually laid out.  Because the report graders varied between Birmingham and Kansas, I won't compare across events (that's a whole 'nother can of worms), but I would like to point out some inconsistencies in the scoring among the 4 judge feedback sheets that were returned to us.

The report is broken down into 5 segments, and on the instructions sheet the judges are given the following general instructions:
Originally posted by the Design Report Score Sheet Instructions:

Thank you for your support of the Baja SAE 2011 Competition. Please remember the importance of the Engineering Design aspect of this competition.  On the next worksheet, you will find the Design Score Sheet that was put together to help you expedite the design report judging process.  It has broken the total possible score for the design report down into several sections.  Each section has a portion of the total possible points associated with it.  All you have to do as judges is enter either a “1” or “0” in the first column.  The sheet will automatically score the report.  Award a "1" if you feel the report meets your expectations.  Give a "0" if you do not feel the report is sufficient.  Please remember you need to judge the report on content and ask yourself, "Do the students back up their design with good engineering logic?".  Do not judge the cars based on your opinion of whether the design will work or not!  That responsibility will be left up to the on-site competition design judges.  Each report will be independently judged two times, so please expediently return your results to the Design Report Captain.  The following is a list of the report sections along with some basic questions that you should answer:


For each of the 5 sections I'll post the scores we received, as well as the overall score we received at competition.  For each section the scores are listed as Section (Judge ED, DWB, 8/SW, AMS) / # possible, so all scores are in order.

Analysis Tools, Correct Engineering                                  ( 8, 25, 25, 25) /25
Clarity/Organization                                                 ( 4,  5,  5,  4) /5
Design, Engineering (Innovation, Design, Materials, Construction)    (22, 24, 22, 28) /30
Format (Professionalism)                                             ( 7,  8,  7,  9) /10
Grammar/Spelling                                                     ( 3,  5,  0,  5) /5
Total                                                                (44, 67, 59, 71) /75
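
(For reference, here's a rough sketch of how a 1/0 sheet like the one described above could tally against these section maxima. Splitting each section's points evenly across its yes/no items is my assumption; SAE hasn't published the per-item weights, and the item counts below are invented for illustration.)

# Hypothetical reconstruction of the auto-scoring sheet described in the
# instructions: each section has a point maximum, and a judge enters a 1 or 0
# per item. Only the section maxima come from the actual score sheet; the
# even per-item weighting and the example marks are assumptions.
SECTION_MAX = {
    "Analysis Tools, Correct Engineering": 25,
    "Clarity/Organization": 5,
    "Design, Engineering": 30,
    "Format (Professionalism)": 10,
    "Grammar/Spelling": 5,
}

def section_score(marks, max_points):
    # Scale the fraction of items marked "1" up to the section maximum.
    return max_points * sum(marks) / len(marks)

def total_score(marks_by_section):
    return sum(section_score(marks, SECTION_MAX[name])
               for name, marks in marks_by_section.items())

# A judge who zeroes 2 of 5 hypothetical "Analysis" items costs the team
# 10 of that section's 25 points:
print(section_score([1, 1, 1, 0, 0], 25))  # 15.0
print(total_score({"Clarity/Organization": [1, 1, 1, 1, 0],   # 4.0
                   "Grammar/Spelling": [1, 1, 1, 1, 1]}))     # + 5.0 = 9.0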

The report I wrote received a 66.87 according to the final results, which is not a direct average of any score set that I can readily identify.  Rather, I've been told that each judge has a weighting assigned to normalize all the judges to some degree of equality.

The problem I have is that the two scores highlighted in red are way out of line with the scores assigned by the other judges, and I can't figure out why.

Since judge ED gave us the lowest score, I'll start there:

I was given 0 points for the following categories where, IMO, the deduction was incorrect:
-Use of selection matrix? (There was one for shock selection)
-Use of correct engineering analysis? (Don't know what I did wrong here)
-Assumed loadings realistic?  (Loadings were either drawn from DAQ data or from a referenced SAE technical paper on Baja SAE vehicles)
-Source of data correctly referenced?  (Proper format was followed when referencing said technical paper)
-Body 10 pages or less?  (It is.)
-Figures and tables list?  (Right there in Appendix A)
-Font 12pt?  (The SAE template uses 10pt, as does my report.  How am I expected to use 12pt?)
-Symbols list?  (Right next to the Figures and Tables list in Appendix A)
-Team Members list?  (page 10, bottom of the page)
-Construction technique discussed?  (discussed fabrication processes for almost all systems)
-General material discussion?  (entire paragraph dedicated to choices of higher strength alloys at a cost penalty plus individual discussion of alternate material/construction processes throughout)
-Tables numbered? (there happen to be two tables shockingly labeled, Table 1: Strength to Weight Comparison, and Table 2: Shock Selection Matrix)
-Grammar (report was proofread by several "grammar nazis," including an English teacher and a PE.)

Even if you only add back the most blatantly wrong point deductions, the report score jumps from a 44 to a 63!  And that's not even changing any of the engineering content, merely pointing out labels and the location of a few items.

Anybody else care to share their experiences?

-Chris Reedy
TTU Alumni
Fourwheeler Drawer



"Quick with the hammer, slow with the brain."
Soccerdan7
Organizer

Joined: Sep/22/2010
Location: CA
Status: Offline
Points: 780
Posted: Jun/29/2011 at 4:07pm
Chris, we saw a similar spread in the scores from Alabama. We entered 2 reports and received 2nd and either 8th or 9th for our 2 cars. Judge ED also gave us a hard time on black-and-white things that we had clearly done. His scores were 25 to 30 points lower for the same report. The other judges were fairly consistent in how points were awarded, and although they each stressed different things, the scores all came out within a reasonable range of each other.

I am not sure how the report that got second overall could get a mid-40s score from one judge.
Danny

Cornell

(fall'07 - spring'12)
Former Captain / MEng / that guy with all the carbon
10 races, 7 top tens, 2 overall wins
Dr. T (UAB advisor)
Bolt Sorter

Joined: Dec/31/2010
Location: Birmingham, AL
Status: Offline
Points: 3
Posted: Jun/30/2011 at 12:43pm

Chris, I understand your frustration, as my team has been there many times, including this year. I can answer at least some of your questions, since I helped set up the design scoring volunteers. This experience was quite an eye-opener for me as a team advisor as to how the design scoring system works. I will say the system is not perfect, and I think the best solution we can hope for is for someone to step up and start judging reports for all 3 competitions, like the cost report system or tech inspection. Until that happens, we have this system.

The score on the ten-page limit and the score for the lack of a list of figures are related. When the score sheets arrived and we started looking at them, the question of how to score the appendix arose, so we asked SAE what was allowed in the appendix. We were told that question had not been asked before, and after discussion between SAE, Adam Husseman, and me, it was agreed that the appendix should not contain any tables or text, as stated in the rules. So the 10-page limit was not adhered to when the list of figures was in the appendix. The judges were also instructed not to score anything outside the 10-page limit if it was supposed to be in the text (hence the 0 for the list of figures).

The second issue is that some of the judges were first-time judges for Baja, due to the volunteers we had, so some judges were harder than others depending on their experience with paper reviews. For example, judge ED has reviewed conference papers on several occasions, so he judged the reports by the standards he uses for those. Whereas someone who reads reports in an industry setting is looking more at whether the report was well written and whether you explained what you were talking about, and will not be as rigorous.

In addition, we had a very wide range of experience and disciplines in our judging pool, which is why you saw an emphasis on different areas. We had mechanical, materials, and electrical engineers, as well as physics, industrial distribution, and a range of other majors. I would think this is true at most competitions, but maybe not. We also had several industry reps and experts in the manufacturing, casting, and composites areas. Those judges tended to catch a lot of the assumptions and errors in what teams wrote about those areas, so they would score a report section a lot harder than someone who was not in that field. The volunteers were distributed across the reports so that a variety of majors saw each report. The idea was to give you a range of feedback.

All of the scores were reviewed for consistency by a final review committee of experienced Baja judges, because there was such a wide spread of scores. The scores were also normalized, so as long as a judge was consistent across all of the reports, normalizing the scores accounted for his harshness. We did not see any inconsistencies for a single judge. That is how you can receive 2nd place with one very low score.
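
For what it's worth, here is a rough sketch of what that kind of per-judge normalization could look like. Z-scoring each judge against his own mean and spread is purely my illustration; the actual weighting formula used in the results was never published.

import statistics

def normalize_judge(raw_scores):
    # Z-score one judge's raw scores (report -> score) against that judge's
    # own mean and spread, so a uniformly harsh or generous judge ranks the
    # reports the same way as everyone else.
    mean = statistics.mean(raw_scores.values())
    spread = statistics.pstdev(raw_scores.values()) or 1.0  # guard: all scores equal
    return {report: (score - mean) / spread for report, score in raw_scores.items()}

# A judge who scores every report ~20 points low produces the same z-scores
# as a generous judge, so one low raw score need not drag a team's placement
# down (team names and numbers below are made up for illustration):
harsh    = normalize_judge({"Team A": 44, "Team B": 48, "Team C": 40})
generous = normalize_judge({"Team A": 64, "Team B": 68, "Team C": 60})
print(harsh == generous)  # True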

The next issue is that a lot of our judges were not even Baja alumni, so they elevated the scores for innovation and design, whereas someone with a lot of Baja experience would score those lower. A good solution for this issue would be to get more Baja alumni to volunteer to judge design reports. Nobody wants to do that part because they do not want to read the reports, just like nobody wants to write them. So we get what we get as far as judges go. Most of the alumni we did get came because of a similar report-scoring experience. They remembered the aggravation and decided to come back and assist with the issue. Those judges did an excellent job of giving clear feedback and helpful suggestions.
 
I hope this helps answer at least some of your questions. If you have more, I'll try to answer them.

"Volunteers do not necessarily have the time; they just have the heart."

Laxtondt
Bolt Sorter

Joined: Jun/30/2011
Status: Offline
Points: 26
Posted: Jun/30/2011 at 5:28pm
That does clear up most of why some of the scores seemed mighty low, even if I don't completely agree with how it was done.  Still, it's good to know there was a method to the madness.

Still, counting off for the team members list when it is in the report is a careless and avoidable error that probably should've been caught by the 2nd judge.
Tyrell Laxton
TTU Baja (guy with the hair)
www.ttubaja.com
Dr. T (UAB advisor)
Bolt Sorter

Joined: Dec/31/2010
Location: Birmingham, AL
Status: Offline
Points: 3
Posted: Jun/30/2011 at 5:39pm

I am sorry for the oversight, but it should not have been reflected in the final score. I will assume the judge was looking for the list of team members on the cover page or at the top of the report and scored that item prior to reading the report. Most of the other info of that kind was on or near the cover and page 1 of the report. The judges never saw each other's sheets, nor did they discuss the reports; this was an attempt to keep them from influencing each other's scores. Maybe group discussion and scoring would have been a better approach. In any case, the final review committee would have caught the discrepancy, and it should not have affected the final score since 3 of the 4 judges did not count off for it.

"Volunteers do not necessarily have the time; they just have the heart."

Laxtondt
Bolt Sorter

Joined: Jun/30/2011
Status: Offline
Points: 26
Posted: Jul/01/2011 at 7:43am
Please don't think I'm trying to be harsh; it does peeve me just a little since it was my team. However, I'd like to see this treated as a preventable issue, and help better the competition so it doesn't happen again.
 
Perhaps this could even lead to suggestions for changing the rules regarding the design report, such as color-coding the important sections.
Tyrell Laxton
TTU Baja (guy with the hair)
www.ttubaja.com
kcsvoboda
Bolt Sorter

Joined: Jul/01/2011
Location: Illinois
Status: Offline
Points: 2
Posted: Jul/08/2011 at 5:38pm
We definitely got the comment that we "needed to include figures and tables in the text" at Peoria, and we had two scores that were very different, with comments that contradicted each other.  One reader loved things that the other hated.  I realize it could be worse; we could have had two readers who both disliked our report for minor issues.

That being said, although it may be tedious and require more effort from volunteers, it might be wise to adopt a system like the written portion of the GRE, where each essay is scored by two readers on a scale of 1-6.  If the two scores differ by more than 1 point, a third reader's score is used; if they are within 1 point, they are averaged.  In the case of Baja, a 10-point difference could be used as the trigger.
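
In code, that adjudication rule is only a few lines. This is just a sketch of my suggestion (the 10-point threshold and the use of the third read as the final score are my proposal, not anything SAE does today):

def adjudicated_score(first, second, third=None, threshold=10.0):
    # Average the first two reads if they agree within the threshold;
    # otherwise fall back to a third reader's score.
    if abs(first - second) <= threshold:
        return (first + second) / 2
    if third is None:
        raise ValueError("Reads disagree by more than the threshold; "
                         "a third read is required.")
    return third

print(adjudicated_score(67, 71))      # 69.0 -- within 10 points, so averaged
print(adjudicated_score(44, 67, 59))  # 59   -- 23 points apart, third read decides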
Kat Svoboda

University of Illinois at Urbana-Champaign
Off-Road Illini Team Captain '11
schooter
Organizer

Joined: Feb/22/2010
Status: Offline
Points: 224
Posted: Nov/15/2011 at 4:12pm
Hey Dr. T, do you by chance have the contact info for whoever handles volunteering? If you do, can you please post it here so all of the alumni on here can see it? I've tried volunteer@sae.org without hearing anything back (not sure if someone is just on vacation or something).

Thanks
Chase Schuette
https://www.linkedin.com/in/chaseschuette/