DIEF

Design institutes in India often face a question: is there any framework for measuring best practices in design education and the quality of student outcomes? The answer is 'no'. Several related questions follow: how do you measure design qualities and learning outcomes? What are the best practices in design education? What defines quality in design education? What is the standard of design colleges in India?

To answer these questions, Drona Seeker has recently launched the Design Institute Evaluation Framework (DIEF) to help design institutes maintain a minimum standard of design education in India. The DIEF is also established with the goal of nation building, supporting and promoting Make-in-India, Start-up India, and the New Education Policy of India. According to this framework, a design institute can be evaluated on the following measures –

  1. Outcome-Based Learning Practices (OBLP) in Design Education – 30% – Quantitative and Qualitative Measures
  2. Infrastructure for Design Education (IDE) – 10% – Quantitative and Qualitative Measures
  3. Design Research Outcomes (DRO) – 30% – Quantitative and Qualitative Measures
  4. Design Culture and Social Contributions (DCSC) – 20% – Quantitative and Qualitative Measures
  5. Placement Records (PR) – 10% – Quantitative Measure

Important Benchmarks

Obtained Score   Grade   Impression

91-100%          D+++    Excellent Design School
81-90%           D++     Very Good Design School
71-80%           D+      Good Design School
61-70%           D       Average Design School (many interventions required)
<60%             D-      Poor Design School
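As a quick illustration, the weighted total and grade band can be computed as in the following sketch (a minimal illustration only; the `dief_total`/`dief_grade` names and the sample component scores are hypothetical, and each component score is assumed to be a percentage of that component's maximum):

```python
# Sketch: combine the five DIEF component scores (each 0-100) into a
# weighted total and map it to a grade band, per the weights listed above.
WEIGHTS = {"OBLP": 0.30, "IDE": 0.10, "DRO": 0.30, "DCSC": 0.20, "PR": 0.10}

def dief_total(scores: dict) -> float:
    """Weighted total (0-100) from per-component percentage scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def dief_grade(total: float) -> str:
    """Map the obtained score to the benchmark grade bands."""
    if total >= 91:
        return "D+++"
    if total >= 81:
        return "D++"
    if total >= 71:
        return "D+"
    if total >= 61:
        return "D"
    return "D-"

# Example with hypothetical component scores:
scores = {"OBLP": 85, "IDE": 70, "DRO": 60, "DCSC": 75, "PR": 90}
total = dief_total(scores)      # weighted total, here approximately 74.5
grade = dief_grade(total)       # falls in the 71-80% band -> "D+"
```

A school strong in learning practices but weak in research, as in the example, still lands in the "Good Design School" band because OBLP and DRO each carry 30% of the weight.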

1. Outcome-Based Learning Practices (OBLP) in Design Education – (Weightage: 30%) 

The OBLP refers to measuring the attainment of design learning against learning objectives, as a result of best practices in design mentoring. To measure OBLP, map the programme outcomes of a design programme to the course outcomes of every course under that programme. Faculty or programme designers should then prepare course content and map assignments or theory test questions to the course outcomes. Next, set a benchmark for each course in terms of the skill or knowledge students should gain, e.g. 50% of students in a batch should score more than 50% in a course, assessed against a fixed set of suitable rubrics. After the actual assessment of a course, the faculty should calculate the attainment and cross-check the deviation from the preset benchmark based on the marks obtained by students taking that course. The faculty or programme designer should also document the best mentoring pedagogy (e.g. gamified learning, experiential learning) applied during the course. The OBLP should be calculated for all courses under a programme, and the cumulative results should be monitored by an expert team of design educators. The teacher-student ratio should not exceed 1:15; institutes exceeding it score lower in this section.
# Quantitative (40%) and Qualitative Measures (60%)
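The benchmark check described above can be sketched as follows (a minimal illustration; the `attainment` function and the sample marks are hypothetical, using the example benchmark of at least 50% of students scoring more than 50%):

```python
# Sketch: check a course's OBLP attainment against a preset benchmark,
# e.g. "at least 50% of students should score more than 50% of max marks".
def attainment(marks, max_marks=100, pass_fraction=0.5, pass_mark_fraction=0.5):
    """Return (attained fraction, deviation from benchmark).

    A positive deviation means the benchmark was met or exceeded."""
    cleared = sum(1 for m in marks if m > pass_mark_fraction * max_marks)
    attained = cleared / len(marks)
    return attained, attained - pass_fraction

# Example: 6 of these 10 students score above 50 out of 100,
# so attainment is 0.6 and the deviation is about +0.1.
marks = [72, 55, 48, 61, 39, 80, 52, 44, 67, 30]
attained, deviation = attainment(marks)
```

Running the same calculation per course and recording each deviation gives the cumulative picture that the expert team is meant to monitor.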

2. Infrastructure for Design Education (IDE) – (Weightage: 10%)

The IDE depends on the kind of programme offered by a design school. For instance, a product design undergraduate (UG) programme requires mechanical workshop facilities, a digital drawing studio, a CAD modelling studio, a clay modelling studio, a painting studio, etc., whereas a user experience design undergraduate programme requires a computer lab with digital prototyping and visual design software, an immersive experience (AR/VR) studio, a tangible interface design lab, a human factors and usability testing lab, etc. Safety guidelines and software/tool user guides should be displayed in all labs, studios, and workshops. All facilities should be scaled to the student strength. The computer-student ratio should not be greater than 1:10. All classrooms should be ICT enabled. The design school's physical infrastructure should be designed so that it is perceived as a design habitat.


# Quantitative (20%) and Qualitative Measures (80%)
3. Design Research Outcomes (DRO) – (Weightage: 30%)

Research activities of a design school should be presented for the last 5 years. The DROs are listed below –

  • Design Research Papers: number of articles published in design-allied journals/book series by each faculty member
  • Intellectual Properties: design registrations, trademarks, copyrights, etc. by each faculty member
  • Total Funds Awarded for Research Activities: funds received from external funding agencies (70%) and funds invested by the institute (30%), per faculty member and student
  • Total Funds Received by Faculty Members through Consultancies: consultancy funds received from external agencies, per faculty member
  • Design Research Papers in Conferences: number of articles published in conference proceedings by each faculty member
  • Faculty Development Programmes Attended: number attended, relative to the number of faculty members
  • Overall citations plotted over the last 5 years (for all faculty members and students)
  • Overall H10 index (for all faculty members and students)

# Quantitative (80%) and Qualitative Measures (20%)
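Assuming the citation index above is an h-type metric computed from per-paper citation counts, it can be sketched as follows (the function names and sample counts are illustrative; the i10-index is shown alongside for comparison):

```python
# Sketch: citation indices from a list of per-paper citation counts.
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations each."""
    return sum(1 for c in citations if c >= 10)

# Example: citation counts for a school's papers over the last 5 years
cites = [25, 12, 8, 5, 4, 4, 1, 0]
h = h_index(cites)      # 4 papers have at least 4 citations each -> 4
i10 = i10_index(cites)  # 2 papers have 10 or more citations -> 2
```

Pooling the counts of all faculty members and students gives the "overall" figure the framework asks for.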
4. Design Culture and Social Contributions (DCSC) – (Weightage: 20%)

DCSC practice depends on the following parameters –

  • Short-term design orientation (e.g. exposing or sensitising students to good design within a context and timeframe) and long-term design orientation (e.g. participation in design conferences, seminars, design dialogues, design competitions, etc.) – how many students participated?
  • Design and Crafting Habitat Creation – how many facilities, infrastructure items, or objects were created? How many students and faculty members were involved?
  • Gender-Based Design and Representation – how many designs? Utility record?
  • Design Catering to Local Requirements – how many designs? Utility record?
  • Design Catering to Business Requirements – how many designs? Utility record?
  • Co-creation and Co-design Practices – how many designs? Utility record?
  • Design for Uncertainties (e.g. disasters, recessions, etc.) – how many designs? Utility record?
  • Individual Creation and Peer Collaboration – how many design projects? How many students?
  • Contributions to the Economic Growth of Individuals / the Country – how many designs? Utility record?
  • Sustainable Livelihoods / SDGs, etc. – how many design projects? Utility record?

# Quantitative (80%) and Qualitative Measures (20%)
5. Placement Records (PR) – (Weightage: 10%)

The PR score depends on –

  • Percentage (%) of total students placed in each academic year at the design school
  • The median salary of students placed in each academic year at the design school

# Quantitative (100%) and Qualitative Measures (0%)
Design School Compliance Check Programme

Every institute should conduct this programme and have all assessment results validated by external experts. Institutes can also form a team, self-assess the design school, and then validate the assessment with external design experts. The internal or external assessment team may be constituted as follows –

  • Chair Person (Generally Dean/VC of the School/ University or Nominee)
  • Internal Academic Expert 1 (Senior Faculty having Knowledge of Academic Compliances/ Head of the Department of Same Domain of Design) 
  • External Academic Expert 2 (Senior Faculty having Knowledge of Academic Compliances and Same Domain of Design/ Head of the Department)
  • External Academic Expert 3 (Senior Faculty having Knowledge of Academic Compliances/ Head of the Department of Any Domain of Design)

*** Experts should cross-check evidence for all submitted data.
At least three of these experts should assign scores. The final assessment score is calculated as 60% from the external experts' scores and 30% from the internal experts' scores.
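Using the weights stated above (60% external, 30% internal), the combination can be sketched as follows (the `final_score` name and the panel scores are hypothetical; the text does not specify how the remaining 10% is allocated, so this sketch applies only the stated weights):

```python
# Sketch: weighted final assessment score from the expert panels.
# Inputs are 0-100 scores; averages are weighted 60% external, 30% internal.
def final_score(external_scores, internal_scores):
    """Combine panel averages per the stated 60%/30% weighting."""
    ext_avg = sum(external_scores) / len(external_scores)
    int_avg = sum(internal_scores) / len(internal_scores)
    return 0.6 * ext_avg + 0.3 * int_avg

# Example with hypothetical panel scores:
ext = [80, 90]    # two external experts
internal = [70]   # one internal expert
score = final_score(ext, internal)   # 0.6 * 85 + 0.3 * 70, about 72.0
```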

Every year, Drona Seeker arranges the DIEF compliance check and sends a team of experts to visit the design institutes/schools that participate in the DIEF.

No application fees are required. The participating institute only needs to arrange travel, food, and lodging for the team of external experts (provided by Drona Seeker) for the compliance audit.

Acknowledgement: We are thankful to Dr. Anirban Chowdhury (Ph.D. in Design from IIT-Guwahati) for voluntarily creating the draft of the Design Institute Evaluation Framework (DIEF).

Contact: cr@dronaseeker.co.in / cr.dronaseeker@gmail.com
