Booth Id:
SOFT056
Category:
Systems Software
Year:
2017
Finalist Names:
Deshmukh, Neil
Abstract:
Diagnosing plant and skin diseases automatically from images is a challenging task. In particular, smartphone images of plants and skin vary widely in zoom, angle, and lighting, which makes classification difficult. Convolutional Neural Networks (CNNs) have become a powerful machine-learning tool and show great promise for image classification.
The purpose of this project is to diagnose plant and skin diseases automatically, with accuracy comparable to that of modern dermatologists and plant pathologists. To this end, I developed an iOS application that detects skin and plant diseases using two purpose-built CNNs. The networks were trained on more than 500,000 dermoscopic and plant-pathology images, covering 13 human and 17 plant diseases, to make the classifiers robust to image variability.
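The abstract does not publish the networks' architectures, so as a hedged illustration only, the core building blocks such CNN classifiers stack (convolution, ReLU activation, max pooling) can be sketched in plain NumPy; all shapes, names, and the toy filter below are assumptions, not the project's actual design:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    # Zero out negative responses, keeping only activated features.
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    # Downsample by keeping the strongest response in each size×size tile.
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

# Toy 6x6 "image" passed through one conv/ReLU/pool stage with a
# hypothetical vertical-edge filter.
image = np.arange(36, dtype=float).reshape(6, 6)
edge_kernel = np.array([[-1.0, 0.0, 1.0]] * 3)
features = max_pool(relu(conv2d(image, edge_kernel)))
print(features.shape)  # (2, 2)
```

A real network learns many such kernels from labeled data rather than hand-coding them, and repeats these stages to build up features invariant to the zoom, angle, and lighting variation described above.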
Running on an iOS device, the application classifies skin diseases with 83.4% (±0.01) accuracy and plant diseases with 85.7% (±0.026) accuracy. Testing was done on images taken with a low-quality camera under imperfect lighting and angles. To use the app, the user simply aims the smartphone camera at the potentially diseased area, and the app classifies the image with the CNNs to provide a real-time diagnosis.
The CNNs outperform every other system tested, and the application's efficiency and ease of use make it a helpful tool for people around the world. With more than 3.4 billion smartphone subscriptions worldwide, the application could provide low-cost universal access to vital diagnostics.
Awards Won:
Second Award of $2,000