
Knock Knock

Machine Learning / UI & UX / Agriculture


Challenge

How Might We help the food industry detect harmful foods?

Do you remember when your grandma knocked on watermelons to see if they were ripe? The subtle art of picking out fresh produce is dying out.

Solution

Exploring sound diagnostics through everyday experience, Knock Knock is a mobile application concept that uses machine learning to determine the ripeness of fruit.

Team

An interdisciplinary team of designers and engineers: Hanson Cheng, Bahareh Saboktakin Rizi, Fay Feng, and Hugo Richardson.

App Flow


Machine Learning


Image & Sound Data

Using image and sound data together, we would be able to determine the ripeness of a fruit.


Image Data

Fruits are classified by color, shape, and size. We created an initial dataset of labeled images, which would then be used to train the sound-based machine learning model. We validated the idea by training a convolutional neural network for real-time object detection using TensorFlow.
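As an illustration, the sketch below shows how a small image classifier of this kind could be set up in TensorFlow. It is a simplified stand-in rather than the project's actual model: the folder layout, image size, and class labels are assumptions, and the real-time object detection model mentioned above would use a dedicated detection architecture instead of this plain classifier.

# Minimal sketch: a small CNN that classifies labeled fruit images.
# The directory layout, image size, and epoch count are hypothetical.
import tensorflow as tf

IMG_SIZE = (128, 128)
BATCH_SIZE = 32

# Assumes a folder of labeled images, e.g. data/ripe/..., data/unripe/...
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data",
    image_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(len(train_ds.class_names)),  # one logit per class
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(train_ds, epochs=10)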

Sound Data

We gathered a library of audio files from knocking on objects.

 

Using a Fast Fourier Transform (FFT) API, we converted the knocks in the collected audio files from the time domain to the frequency domain.

 

The resulting spectra show a promising pattern that could be used for further classification.
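As a rough illustration of this step, the sketch below computes the magnitude spectrum of a single knock recording with NumPy and SciPy. The filename is hypothetical, and the recording is assumed to be a mono WAV file.

# Minimal sketch: convert a recorded knock from the time domain to the
# frequency domain with an FFT.
import numpy as np
from scipy.io import wavfile

sample_rate, samples = wavfile.read("knock_watermelon.wav")  # hypothetical file
samples = samples.astype(np.float64)

# Real-input FFT: magnitude spectrum and the frequency of each bin.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

# The dominant frequencies of the knock could serve as features for
# a ripeness classifier.
peak_freq = freqs[np.argmax(spectrum)]
print(f"Dominant frequency: {peak_freq:.1f} Hz")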


Potential Software Flow

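As a rough sketch of how the pieces above could fit together, the snippet below chains an image-based fruit classifier and a sound-based ripeness check. Both classifiers are placeholders rather than the project's trained models, and the frequency threshold is illustrative only.

# Sketch of the potential software flow: identify the fruit from the
# camera image, then judge ripeness from the knock recording.
import numpy as np

def classify_fruit(image: np.ndarray) -> str:
    # Placeholder for the TensorFlow image model (color, shape, size).
    return "watermelon"

def classify_ripeness(fruit_type: str, knock: np.ndarray, sample_rate: int) -> str:
    # Placeholder for the sound model: FFT the knock and inspect where
    # most of the energy sits. The 200 Hz threshold is illustrative only.
    spectrum = np.abs(np.fft.rfft(knock))
    freqs = np.fft.rfftfreq(len(knock), d=1.0 / sample_rate)
    dominant = freqs[int(np.argmax(spectrum))]
    return "ripe" if dominant < 200.0 else "unripe"

def assess(image: np.ndarray, knock: np.ndarray, sample_rate: int) -> tuple[str, str]:
    fruit = classify_fruit(image)
    ripeness = classify_ripeness(fruit, knock, sample_rate)
    return fruit, ripeness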