For decades, the traditional model of engineering education has been to teach physics, chemistry, and mathematics as foundation courses in the first year, followed by courses on the foundational areas of the discipline. Only after that can a student try to practice engineering. The overall model has been to teach the foundations in the initial years; only in the final year may students do full engineering projects in which they build some systems. (Unfortunately, in most cases decent engineering projects are not done even in the final year.)

This model has been under challenge for some time, particularly in the West, as it does not let students experience the excitement of engineering, which comes from building useful systems that work, until very late. To address this, many institutions across the world have introduced project-based courses early in the curriculum to give students some experience of building systems.

At IIIT-Delhi, we introduced two such courses in the first year very early on, both focused on hands-on experience. In the first semester, students take a course called “System Management”, in which they work with laptops and mobile phones and their components, learn what these machines can do and how to manage them well, explore their internals by opening them up, and so on.

In the second semester (by which time they have learned programming as well as electronics), we introduced an Intro to Engineering Design (IED) course, whose basic goal is to design a working physical system that includes both hardware and software (software-only projects are not permitted) to solve some problem. In IED the focus is on the project – the lectures exist to support it. They provide an overview of the basic components widely used in such projects: a cheap but versatile platform like the Raspberry Pi or Arduino, common sensors for vision, motion, proximity, etc., and actuators such as stepper motors. Students also learn a bit about the workshop and its tools.

Students form teams and start thinking about their project from the start of the semester. Each team is given a budget to buy components – this exposes them to the process of buying parts in the market, as well as to the basic engineering principles of cost control and delivering a project within budget. The completed projects are then demoed to all at an open house at the end of the semester.

This year, too, I visited the demos and interacted with at least 25 project groups. The course instructor was Alexander Fell, who is himself a fine system builder. I was amazed and highly impressed by the sophistication of the projects the students had executed. Many were better than the final-year projects at many engineering colleges, and some, with extension and further development, could even serve as final-year projects at IIIT-Delhi or an IIT.

To give a sense of the variety and complexity of the projects undertaken, I give below brief descriptions of a few of them (I will keep adding to this list). It is worth remembering that these projects were executed by 2nd-semester students (i.e. only a few months out of class XII), who were simultaneously doing four other courses (at least two of which have their own programming/lab assignments).

Courses of this type emphasize that engineering is about solving people’s problems by building systems and solutions using science, mathematics, and theory. Engineering is clearly not just theoretical understanding, in which problems are solved only on paper and tested in exams, or in simple labs with predefined experiments that students repeat year after year.

Unfortunately, this is what engineering education in the country has degenerated into – most engineering institutions teach concepts (and those not too well) with almost no exposure to actual engineering, largely because the faculty lack the skills needed to guide such projects. As a result, we find engineering graduates who have no real engineering or problem-solving skills and are therefore not employable. A large number of these graduates then proceed to do an MBA, where engineering skills are not important and only the conceptual understanding needed to solve entrance-test problems is required.

This lacuna in engineering education also contributes to the immaturity of our country’s innovation-led ecosystem for generating businesses that offer new products and solutions, and it has left the engineering industry underdeveloped. Thankfully, one now sees examples of innovation rooted in a deep understanding of the problem and the technology, delivering solutions that work and scale – these are often led by teams with strong engineering capabilities. Thankfully also, some leading engineering institutions, including some IITs (e.g. IIT Delhi), are introducing project-based courses early in their curricula. All this bodes well for the future of engineering in the country.

 

Brief Description of Some of the Projects 

  • GardenBot. This bot is essentially a mobile cart with water, a mechanical arm, a camera, an ultrasonic sensor, etc. It moves freely (choosing its direction randomly), detects an object, and, if the object is a plant (determined using an image-recognition library), checks the moisture of the pot and waters the plant if the soil’s moisture content is low. It is also connected to the internet to check whether it has rained in the past few days, so it can make a smarter watering decision. As it moves autonomously, it can water all the pots in a garden – essentially doing the job of a smart gardener.
    • Components. Moisture sensor, ultrasonic sensor, webcam, five DC motors (four for the wheels, one for the water pump), one servo motor (for the arm), H-bridge for controlling the DC motors
    • Platform and Code. Raspberry Pi, with about 500 LOC of Python.
    • Team. Akshat Singh, Apoorv Khattar, Harshit Chaudhary, Raghav Sood
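GardenBot’s watering decision, as described above, combines a soil-moisture reading with a recent-rain check fetched from the internet. A minimal sketch of that logic (the function names and the threshold are illustrative, not the team’s actual code):

```python
# Sketch of GardenBot's watering decision. The threshold and names
# are illustrative assumptions, not the team's actual code.

MOISTURE_THRESHOLD = 30  # percent; below this the soil counts as dry

def should_water(moisture_percent, rained_recently):
    """Water only if the soil is dry AND it has not rained recently
    (the rain flag would come from an internet weather API)."""
    return moisture_percent < MOISTURE_THRESHOLD and not rained_recently

print(should_water(18, False))  # True: dry pot, no recent rain
print(should_water(18, True))   # False: recent rain already did the job
print(should_water(55, False))  # False: soil is moist enough
```

The rain check is what makes the bot “smarter” than a plain threshold: a dry-looking reading right after rain is left alone.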

 

  • SmartMirror. It’s a smart assistant (like Siri) that you can put on your wall, and it looks like a mirror. It’s powered by a Raspberry Pi and has a monitor with a one-way mirror sheet on it, so it appears to be a mirror onto which things can also be superimposed. The user interacts through voice commands to get news, maps, etc., which the mirror displays by fetching the information from the internet using API calls.
    • Components: Mic, camera (presence detection), a flat monitor (with a one-way mirror sheet pasted on it), speakers; a case was made to hold all the components and make the monitor look like a mirror.
    • Platform and Code. Raspberry Pi 3B; about 4000 lines of Python and JavaScript.
    • Team: Peeyush Kushwaha, Madhur Tandon, Mudit Garg, Siddhant Singh
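The core of such an assistant is routing a recognized voice command to the right information fetcher. A toy sketch of that dispatch step (the handlers here return placeholder strings; the real project would make live API calls instead):

```python
# Toy sketch of SmartMirror-style command dispatch. Handler names and
# return values are hypothetical stand-ins for real API calls.

def show_news(command):
    return "NEWS for: " + command

def show_map(command):
    return "MAP for: " + command

HANDLERS = {"news": show_news, "map": show_map}

def dispatch(command):
    """Route the command to the first handler whose keyword appears in it."""
    for keyword, handler in HANDLERS.items():
        if keyword in command.lower():
            return handler(command)
    return "Sorry, I did not understand."

print(dispatch("Show me the news"))  # NEWS for: Show me the news
print(dispatch("hello"))             # Sorry, I did not understand.
```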

 

  • Faux Arm. A robotic arm that wirelessly mimics the arm movement of the operator. The Faux Arm is a robotic arm with three points of movement, simulating the operator’s elbow joint, wrist joint and two fingers for grabbing and picking things up. We also built the Sensor Sleeve, a sleeve with sensors that can be worn by the operator on his/her arm, serving as a wireless input to the robotic arm.
    • Sensors: 2 x ADXL335 accelerometers (to sense the angle of the arm with respect to the ground)
    • Actuators: MG996R Servo; MG995 Servo; FS90 Servo
    • Microcontroller and code: Two Arduino Unos with two XBee modules; approx. 800 lines of C.
    • Mechanical components used:  Self-designed 3D printed structure of robotic arm; Self-designed aluminum grabber; Elastic, Velcro and a glove for sensor sleeve.
    • Team: Shivin Dass, Anvit Mangal, Taejas Gupta, Aditya Singh.
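Since the ADXL335 senses the arm’s angle with respect to the ground, the sleeve’s core computation is turning two accelerometer axes into a tilt angle and clamping it into a servo’s range. A sketch of that mapping, assuming gravity is the only acceleration (a common simplification; the calibration is illustrative):

```python
# Sketch: ADXL335 tilt reading -> servo angle, assuming the only
# acceleration is gravity. Centring at 90 degrees is an assumption.
import math

def tilt_degrees(ax, az):
    """Tilt of the arm w.r.t. the ground from two accelerometer axes."""
    return math.degrees(math.atan2(ax, az))

def to_servo(angle, lo=0, hi=180):
    """Clamp the tilt into a hobby servo's 0-180 degree range."""
    return max(lo, min(hi, int(round(angle + 90))))  # centre at 90

print(to_servo(tilt_degrees(0.0, 1.0)))  # 90: arm level
print(to_servo(tilt_degrees(1.0, 0.0)))  # 180: arm vertical
```

On the real hardware this would run on the Arduino in C, with the result sent over the XBee link to the arm.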

 

  • Robotic humanoid hand. In this project we constructed a robotic humanoid hand. The 3D model of the hand was open source and easily available from Inmoov. The project used three types of control: glove control using flex sensors (for remote control of the robot), voice commands using a voice-recognition sensor, and direct muscle control using MyoWare muscle sensors. The hand can be used by amputees and the physically challenged (via muscle sensor or voice control), for exploring inhospitable areas (via glove or voice control), etc.
    • Sensors: Myoware muscle sensor V3; Electrohouse Voice recognition sensor; Flex sensors (4×5” and 1×2.5”)
    • Actuators: 5 x MG995 TowerPro servo motors.
    • Mechanical Components: 3D-printed human hand and its assembly (we printed it ourselves).
    • Platform and Code: Arduino; About 300 lines of C code; open source libraries for Voice recognition module.
    • Team: Shreedhar Govil, Siddharth Dhawan, Tanish Gupta, Vishal Singh Rajput
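For the glove-control mode, each flex sensor’s ADC reading has to be rescaled onto a finger servo’s angle range, much like Arduino’s map() helper. A sketch with made-up calibration numbers:

```python
# Sketch of glove control: a flex sensor's ADC reading mapped linearly
# onto a finger servo's range, Arduino map()-style. The calibration
# values (300 straight, 700 fully bent) are illustrative assumptions.

def linear_map(x, in_lo, in_hi, out_lo, out_hi):
    """Rescale x from [in_lo, in_hi] to [out_lo, out_hi] (integer math)."""
    return out_lo + (x - in_lo) * (out_hi - out_lo) // (in_hi - in_lo)

print(linear_map(300, 300, 700, 0, 180))  # 0: finger straight
print(linear_map(700, 300, 700, 0, 180))  # 180: finger fully bent
print(linear_map(500, 300, 700, 0, 180))  # 90: half bent
```

One such mapping per flex sensor drives the corresponding finger servo.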

 

  • ShadowBot. Despite today’s technology, large parts of the world remain inaccessible because humans cannot survive their harsh conditions. Robots can change this. However, AI is not yet developed enough for robots to react accurately in delicate situations. Our project aims to improve a human’s ability to control a robot by having the robot mimic the user’s actions! Project demo on YouTube.
    • Sensors: Microsoft Kinect v1.8
    • Actuators: S3003 Futaba Servos (ten for different joints and degrees of freedom);
    • Mechanical Components: Oblique servo brackets; Long U-shaped servo brackets; Short U-shaped servo brackets; L clamps; Nuts and bolts
    • Power Source: Turnigy 2200mAh Lipo Pack
    • Microcontroller: Arduino Mega with HC-05 Bluetooth Module; About 400 Lines of C# code for Kinect, and 150 Lines of C for Arduino.
    • Team: Aditya Chetan, Anant Sharma, Shwetank Shrey, Siddharth Yadav (mentored by PhD student Manoj Gulati)
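To mimic the user, the Kinect’s tracked joint positions must be converted into angles the ten servos can reproduce. A sketch of the standard three-point joint-angle computation (the Kinect side was in C#; this Python version and its sample points are purely illustrative):

```python
# Sketch: joint angle at point b formed by 3D points a-b-c (e.g.
# shoulder-elbow-wrist from Kinect skeleton tracking). The sample
# coordinates are made up for illustration.
import math

def joint_angle(a, b, c):
    """Angle at b (degrees) between vectors b->a and b->c."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(u[i] * v[i] for i in range(3))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Fully extended arm (collinear joints) -> ~180 degrees
print(round(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))))  # 180
# Right-angle bend at the elbow -> 90 degrees
print(round(joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0))))  # 90
```

Each computed angle would then be sent over the HC-05 Bluetooth link to the Arduino Mega driving the corresponding servo.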

 

  • Amphibian Bot. It is a remotely controlled amphibian robot (through an Arduino RC controller configured for Bluetooth) capable of travelling through varied tough terrain, including water bodies (the antenna and camera remain above the water), to provide a video feed. It comes with an emergency propeller that can be used if the directional motors fail. The entire functionality is controlled via Bluetooth, and a live video feed is sent to the phone by the camera mounted at the robot’s level height.
    • Sensors: Night-vision camera; HC-05 Bluetooth module for the Arduino.
    • Actuators: Geared DC motors (300 rpm, quantity 5: 4 wheels + 1 propeller); lithium-ion batteries (quantity 2, 3V each); L298N motor drivers (quantity 2).
    • Platform and Code: Arduino Mega, 50 lines of C code.
    • Mechanical components: 7.5 cm diameter multi-terrain tyres, light weight plastic box, M-seal and hot-glue (insulation purposes)
    • Team: Ashutosh Sharma, Arshan Zaman, Yash Tomar, Vineet Kumar Rana.

 

  • DrawBot. An automated arm that drew pictures given to it with a pen on paper. The input was an image file. From the grey-scale image, we extracted the edges and lines in the picture (using the Sobel edge-detection algorithm) and then drew these lines with the DrawBot arm; the drawing movement was controlled by two stepper motors. The DrawBot used a greedy “nearest salesman” heuristic that moved the arm toward the nearest undrawn pixel, drawing small line segments using the slope and coordinates.
    • Components: Stepper Motor, Voltage Level Shifter, Gear Belts, Channels (to make arms), H-Bridge
    • Platform and Code: Ras Pi 3B, about 200 lines of Python
    • Team: Simran Deol, Navneet Anand Shah, Aditya Tanwar, Naman Kumar
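The “nearest salesman” ordering above is a greedy nearest-neighbour pass over the detected edge pixels: from the pen’s current position, always jump to the closest undrawn pixel. A small sketch (the pixel list is illustrative; the real code also converted each jump into stepper moves):

```python
# Sketch of DrawBot's greedy "nearest salesman" pixel ordering.
# The pixel coordinates below are illustrative.

def order_pixels(pixels, start=(0, 0)):
    """Return edge pixels in greedy nearest-neighbour drawing order."""
    remaining = list(pixels)
    pos, path = start, []
    while remaining:
        # squared distance is enough for comparison; no sqrt needed
        nxt = min(remaining,
                  key=lambda p: (p[0] - pos[0])**2 + (p[1] - pos[1])**2)
        remaining.remove(nxt)
        path.append(nxt)
        pos = nxt
    return path

print(order_pixels([(5, 5), (1, 1), (2, 2)]))  # [(1, 1), (2, 2), (5, 5)]
```

Greedy ordering is not optimal in the travelling-salesman sense, but it keeps pen travel short enough for a drawing robot at a fraction of the complexity.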

 

  • iDabba. Our project, named “iDabba”, is a smart container that identifies what fruit/vegetable/item is kept in it (using computer-vision techniques; the item has to be one of those trained earlier), as well as the temperature and humidity of the box and the weight of the items. All this information is visible to the user via a web app. We were motivated to design this to solve everyday hassles in kitchens and households regarding spoilage and infestation. It can be scaled to meet the needs of farmers and storage companies for smart storage, acting as a small-scale silo. It can be enhanced to track the age of the items kept, enabling more intelligent decisions.
    • Sensors: Humidity sensor (DHT11); temperature sensor (DS18B20); load cell to measure weight, with an HX711 ADC module for conversion; WiFi module (ESP8266)
    • Platform and code: Arduino Duemilanove and Raspberry Pi 3 (for computer vision); 180 lines of C code for the Arduino; approx. 250 lines of Python with OpenCV, the Microsoft Vision API, and Flask for the backend; front end in HTML/JavaScript, approx. 150 lines
    • Team. Viresh Gupta, Brihi Joshi, Zoha Hamid, Shravika Mittal.
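The weight measurement in a load-cell/HX711 setup like iDabba’s comes down to converting a raw ADC count to grams using a tare offset and a calibration scale factor. A sketch with made-up calibration values:

```python
# Sketch of a load-cell weight reading via an HX711-style ADC:
# grams = (raw - tare) / scale. Both constants below are made-up
# calibration values, not the team's.

TARE_OFFSET = 84000   # raw reading with the empty container
SCALE = 420.0         # raw counts per gram, found by calibration

def grams(raw_count):
    """Convert a raw ADC count to grams of contents."""
    return (raw_count - TARE_OFFSET) / SCALE

print(grams(84000))   # 0.0: empty box
print(grams(126000))  # 100.0: 100 g of produce
```

Calibration means taring the empty box and then reading a known weight once to fix the scale factor.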

 

  • SmartCart: We made an automated cart that follows the user based on a tag the user wears. When a product is placed in the cart, its barcode is scanned and the bill is prepared automatically in an app on the user’s mobile phone. With such a cart, store owners can reduce the manpower needed for checkout and also reduce waiting times for customers.
    • Components: Ultrasonic Transmitters and Receivers (made the circuit for using these), 2x 12V DC Motors, 2100mAh Lipo Battery, Wheels.
    • Platform and Code: Arduino, Android phone. About 90 lines of C code, and about 640 lines of Java code (for the App), about 200 LOC of PHP on the server (mimicking the inventory of the store).
    • Team: Aakash Sehrawat, Anmol Prasad, Nilay Sanghvi, Saksham Vohra
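The billing side of such a cart is a lookup of each scanned barcode in the store inventory plus a running total. A sketch (the inventory dict and prices are illustrative; in the real project the inventory lived on a PHP server):

```python
# Sketch of SmartCart billing: look up each scanned barcode and keep
# a running total. Barcodes and prices are illustrative; the real
# project queried a server-side inventory instead of a local dict.

INVENTORY = {"8901": ("Milk 1L", 52.0), "8902": ("Bread", 35.0)}

def add_to_bill(bill, barcode):
    """Append the scanned item to the bill; return the running total."""
    name, price = INVENTORY[barcode]
    bill.append((name, price))
    return sum(p for _, p in bill)

cart_bill = []
print(add_to_bill(cart_bill, "8901"))  # 52.0
print(add_to_bill(cart_bill, "8902"))  # 87.0
```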

 

  • Plant Watering System. This project provides water (which may contain other essential nutrients) to multiple plants based on their respective moisture-sensor readings. How often the moisture is checked depends on the temperature and humidity readings given by the temperature sensor. A GSM module periodically informs the owner through SMS about the water level of the tank and when the plants are watered. (A few teams did projects of this type.)
    • Sensors. YL 69 Soil Moisture Sensor; DHT11 Temperature and humidity Sensor
    • Actuators. Micro (3–6V) submersible pumps
    • Mechanical Components.  Piping system to water the plants.
    • Platform and code. Arduino, Appx 200 Lines of C code
    • Team1: Raghav Bhatia, Jai Mahajan, Kanha Srivastav, Shashank Kataria
    • Team2: Ashish Kanojia, Dilnawaz Ashraf, Dushyant Jangra, Rishin Lal
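The adaptive part of this system is that the moisture-check frequency depends on the temperature and humidity readings. A sketch of one plausible policy (the thresholds and intervals are illustrative assumptions, not either team’s actual values):

```python
# Sketch of an adaptive watering loop: hot, dry air -> check soil
# moisture more often. All thresholds below are illustrative.

def check_interval_minutes(temp_c, humidity_pct):
    """Minutes to wait between moisture checks, based on weather."""
    if temp_c > 35 and humidity_pct < 30:
        return 15   # hot and dry: check every 15 minutes
    if temp_c > 25:
        return 30
    return 60       # cool weather: hourly is enough

def needs_water(moisture_pct, threshold=40):
    """Pump water for this plant if its soil moisture is below threshold."""
    return moisture_pct < threshold

print(check_interval_minutes(38, 20))  # 15
print(check_interval_minutes(20, 50))  # 60
print(needs_water(25))                 # True
```

Each plant gets its own moisture sensor and pump, so `needs_water` is evaluated per plant on every check cycle.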