My name is Levi Todes, and I was born and raised in Cape Town, South Africa. I grew up, as do many Capetonians, in love with the outdoors, and more often than not you can find me enjoying the mountains and oceans that Cape Town provides!
I did my undergraduate degree in Mechatronics Engineering at the University of Cape Town before heading to Chicago, USA to complete my Master's in Robotics at Northwestern University. I then spent some time in a Mechatronics Design role that taught me a lot about the electro-mechanical design of robots as a product. This role spanned embedded software engineering, mechanical design, and basic electrical design.
Looking Forward!
I am now back home in Cape Town, looking for interesting roles that will challenge me and allow me to learn new skills. I am particularly interested in roles involving embedded software, as I believe I have a lot both to contribute and to learn in that area!
Projects
Please see some of the projects I worked on during my years of study in the Portfolio section above!
Education
Bachelor of Science in Engineering, Mechatronics - University of Cape Town (graduated Dec 2017)
Master of Science in Robotics - Northwestern University (graduated Dec 2019)
The initial goal of this project was to observe the manufacturing process of the wrench in question and identify a bottleneck that could be addressed by automation. Beyond that goal, there were no set requirements on what should be automated or how. This allowed the freedom to follow the design process toward the optimal solution to the problem.
After journey mapping the manufacturing process - assigning time values to each task, along with automatability ratings and assessments of what would be gained from automation - it was clear that automating the set of tasks involved in placing the wrench jaws into the wrench would yield the most gain for the entire process. As such, the task was to design and use whatever was deemed appropriate to automate this process in a way that is both accurate and fast. The resulting design is shown below:
As will be detailed, the mechanical design for this project was done primarily in the Onshape CAD software; the rig was constructed using prototyping methods such as 3D printing and laser cutting; a PIC32MX250F128B microcontroller and the relevant peripheral circuitry control the system; and the microcontroller code was written in C.
Approach
As mentioned above, the first step of this project was simply to observe. Professor Brown gave me videos of the current production process for the wrench, and I began to make note of each task being undertaken, how long each task took and how it affected the other tasks. It was clear that the biggest bottleneck in the process was the placing of the wrench jaws in the wrench. This task represented the biggest gain from automation, but it was also perhaps one of the most challenging steps to automate, given the precision needed to place each jaw and the small amount of space in the middle of the wrench in which a mechanism could actuate.
So began the iterative process of design. The bulk of the work in this project was mechanical design; each week brought a new iteration of the current design as well as thoughts on a new approach. That journey can be traced by following the "STL" files of the various designs that were cycled through before arriving at the final stop.
It became important to create journey maps of best-case scenarios, so as to assess the approach and figure out what could be done in time as well as what could be done better.
Perhaps the most crucial change in approach came when it was decided that using gravity to place the jaws could simplify the process by eliminating the need to actuate within the small space in the middle of the wrench. This arose as a serendipitous thought during a brainstorming session with Professor Brown. As such, the final approach was to use gravity to place the jaws in the wrench, with the assistance of an electromagnet, and to rotate the wrench around the jig and repeat, so as to place all 6 jaws.
The full design journey is detailed below!
Mechanical Design
The mechanical design involved in this project was by far the most important, most complex and thus most time-consuming aspect of the project. As is the nature of design, it involved many iterations as well as a few changes in approach that ultimately led to the final design. This final setup is shown below:
Initial Designs
The two initial goals of the design, deemed the most important in order to achieve the end goal, were to hold the wrench in place and then to be able to place a jaw in the wrench in a manner that could be repeated 6 times, once for each jaw. This led to the first iterations of the design worth mentioning.
Outer Jig - Wrench Holder
It became clear that one way in which 6 jaws could be placed repeatably would be to hold the wrench in place around the center, wherein the jaw actuation would take place, and then rotate the wrench around this actuator. This led to the following design. The indentations around the circle are there to allow a jaw to be placed in any of those six spots. The raised bumps on either side of the circle are there to help fix the wrench in place.
This design was clearly not very good, as it was not dimensioned well enough to hold the wrench at all. It evolved into a design built from the wrench's actual dimensions, which included a bit more support for the bottom of the wrench.
This design was finally dimensioned to hold the wrench open precisely at a certain angle, so as to allow the jaws to be placed within the wrench. It further includes some protection against the wrench falling out of the jig, since the final design requires the jig-and-wrench combination to be held at a 90 degree angle.
Inner Jig - Jaw Placement
The jaw placement designs tell a story of the two key changes of approach that completely revolutionised how the whole project would look. The first design for the inner jig was based on the idea that the jaws would be fed into the system from above in a uniform manner. This meant that the jaws would have to be shifted laterally, in a translational manner, to place each jaw in the wrench.
As can be seen in the above design, the jaw was supposed to be caught by the "jaw placer" in the middle of the jig, which would then be actuated laterally - perhaps with a solenoid - from below so as to place the jaw. The problems encountered with this design were the low reliability of the jaw falling in the correct orientation and the lack of physical space in which to actuate the "jaw placer". This led to the next idea, which was to feed the jaws into the system from the bottom.
This design was based on the same principle of using a "jaw placer" in the center of the jig to place the jaws. This time, though, the jaws fed into the jig from the bottom would push against the ramp in the housing in the middle of the jig, which would in turn force each jaw out of the opening and onto the "jaw placer". The "jaw placer" would then be actuated laterally from below so as to place the jaw.
This again saw problems similar to the "feed from the top" design, as the jaw still had to fall into place and the room for actuation was not improved.
While playing around with the 3D printed models of the design, it was noticed that holding the center jig at a 90 degree angle to the wrench allowed the jaws in the system to simply fall into place! This revelation led to the final approach used in the project.
Final Designs - Gravity is my Friend!
Inner Jig
The final design idea is to have the jaws fed laterally into the inner jig. Once in the inner jig, the jaws can then fall into the wrench, provided the outer jig holds the wrench in the correct spot. Below is the design for the inner jig, along with a feeding tube used to get the jaws into the jig:
With this design, the key objectives to make it useful are to propel the jaws along the feeding tube into the inner jig, to control how and when the jaws are dropped into the wrench, and to keep the jaws in the correct orientation so that they can fall into place. A simple solution to all three objectives is an electromagnet placed at the head of the inner jig, together with a steel bar placed above the head of the inner jig that extends to the feeding tube. This bar acts as a magnet when the electromagnet is on, and as such keeps the jaws oriented correctly. The steel bar can be seen in the image above; the placement of the electromagnet can be seen in the full assembly of the final design.
Rig
In order to hold this design at 90 degrees, a rig was designed with two main objectives: it had to support the weight of all the necessary components of the design, and it had to allow for the movement the design requires. It can be seen below:
This rig is made from 3mm thick acrylic sheets and was laser cut and assembled from 6 parts. The drawings of these cuts can be found in `.dxf` format in this directory of the Github repository. The rig has holes in the front piece that allow the necessary components of the design to be screwed into the correct locations.
Gears
With the rig and inner jig in place, it became clear that rotating the outer jig holding the wrench around the inner jig - in order to place all 6 jaws - would require a gear system driven by a motor. The gears were designed such that the driver gear could be controlled by a stepper motor and the driven gear could hold and rotate the outer jig. The gears were designed with a tooth ratio of 35:15, chosen to allow for easy rotation in minimal space: the driver gear can be small without requiring a very strong motor. The driver gear was 3D printed so that it could be connected to the stepper motor using a sprocket with a set screw. This design is shown below, including the sprocket used:
The driven gear was laser cut and designed with a hole in its middle in the shape of the outer jig. The outer jig was then joined to the gear, making one part capable of rotating around the inner jig while holding onto the wrench. The driven gear with the outer jig holding a wrench can be seen below:
Clamps
Since the driven gear has to hold the wrench, this part - with the wrench in place - is heavy and could easily fall off the front of the rig. With this in mind, magnetic clamps were designed to hold the driven gear flush against the face of the rig while still allowing full rotation of the gear. They are designed to hold magnets on either side of the rig. This allows the driven gear and the wrench to be easily removed so a new wrench can be placed into the system, yet lets the gear rotate freely and mesh properly with the driver gear. The design is shown below; the cylindrical holes are dimensioned to hold magnets firmly in place:
Actuators
In order to actuate the design as described, the electromagnet and stepper motor need to be held in place. The motor was held, with its shaft at the necessary distance, using a 3D printed spacer screwed into both the front of the rig and the motor itself. Further, the electromagnet needs to be held at the head of the inner jig. Its holder was made of two 3D printed parts, elecmag_holder1 and elecmag_holder2, designed to hold the heavy electromagnet in place while allowing the full rotation of the wrench.
Electronics
The electronics of this system can be broken down into three separate sections, namely the microcontroller and its surrounding circuitry, the electromagnet driver and the stepper motor driver.
Microcontroller
The brains of the project, triggering the electromagnet at the right time and controlling the rotation and timing of the stepper motor, is a PIC32MX250F128B microcontroller. The breakout schematic of this microcontroller, necessary for programming as well as interaction at a 3.3V logic level, is shown below. It includes a 3.3V voltage regulator, a Pololu micro-USB breakout board and an MPLAB Snap for programming.
Electromagnet Driver
The circuitry used to drive the electromagnet consists of a GPIO pin from the microcontroller going through a 1 kΩ resistor to the base of an NPN Darlington transistor. The collector is connected to a 12V power supply through the parallel combination of the electromagnet and a flyback diode, and the emitter is connected to ground. This allows the GPIO pin to be used as a simple on/off switch for the electromagnet.
Stepper Motor Driver
The stepper motor is controlled by the Pololu MD20b board, a breakout for the DRV8825, which contains two H-bridge drivers and a microstepping indexer along with some protective circuitry. The board lets the stepper be controlled simply: three GPIO pins set the step resolution (M0, M1, M2) and one pin outputs a PWM signal (STEP). It also accepts 3.3V logic from the microcontroller while supplying the motor with the 12V it requires. As can be seen in the code, pins M0, M1 and M2 set the resolution at which the motor is controlled: with M0 and M1 high and M2 low, the driver runs in 1/8-step mode, meaning the 200-step-per-revolution motor now has 1600 microsteps per revolution. A PWM signal sent to the STEP pin drives the motor.
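As a rough illustration of the step arithmetic (the real code is C on the PIC32; this is just a back-of-envelope Python sketch, and the gear assignment is assumed from the description above):

```python
def microsteps_for_wrench_angle(wrench_deg, steps_per_rev=200, microstep=8,
                                driven_teeth=35, driver_teeth=15):
    """Microsteps the motor must take to rotate the wrench-holding driven gear."""
    driver_deg = wrench_deg * driven_teeth / driver_teeth  # 35:15 gear reduction
    return driver_deg / 360.0 * steps_per_rev * microstep

# Indexing the wrench one jaw position (360/6 = 60 degrees):
print(microsteps_for_wrench_angle(60))  # ~622 microsteps
```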
Code
The code for this project was written in C in the MPLAB X IDE and loaded onto a PIC32MX250F128B microcontroller.
The purpose of this code is to control when the electromagnet is triggered and when and how far the stepper motor rotates. The code sets GPIO pins on the microcontroller high and low so as to rotate the stepper motor the necessary amount at the correct time and to turn the electromagnet on and off at the correct times. A user button steps through each state - electromagnet off, electromagnet on and rotate stepper motor - which allows easy control of the system. The code can easily be changed to not rely on the button, so the process can run without human intervention.
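The firmware itself is C, but the control flow boils down to a small state machine; here is a minimal Python sketch of that cycle (the pin-control callables are hypothetical stand-ins):

```python
# The three states cycled through on each button press.
STATES = ["ELECTROMAGNET_OFF", "ELECTROMAGNET_ON", "ROTATE_STEPPER"]

def on_button_press(state_index, set_magnet, rotate_stepper):
    """Advance to the next state and perform its action."""
    state_index = (state_index + 1) % len(STATES)
    state = STATES[state_index]
    if state == "ELECTROMAGNET_OFF":
        set_magnet(False)          # release the jaw
    elif state == "ELECTROMAGNET_ON":
        set_magnet(True)           # hold the next jaw in the inner jig
    else:
        rotate_stepper(steps=622)  # index to the next jaw position (see above)
    return state_index
```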
Future Work
Further work on this project could include:
The mechanical and electrical design of an automated rivet pin placer. With this complete, the whole system could run independent of human interaction. I think a 'bullet magazine' type mechanism actuated by a solenoid could achieve this.
The design and printing of a dedicated PCB for the system as opposed to leaving everything on a breadboard. With the schematics already developed, this could be done quickly and easily.
Integration with the rest of the manufacturing process as well as automation of the rest of the manufacturing process presents a great new challenge to tackle, one with great reward.
Further Reading
For more detailed descriptions of how this all came together, please visit the project's Github repository found here.
This project was to design, build, program and eventually play a Human-Robot Ukulele Player, or H-RUP! The inspiration came from a love of music, a desire to learn the ukulele and an inability to do so - so why not build a robot to help?
This robot is designed to help someone play a song on a ukulele without pressing any chords: H-RUP presses the chords for you, and an LED gives you a 3-flash countdown and tells you when to strum!
Approach
The basic approach I followed was to develop a 4x4 bank of solenoids housed above a small section of the ukulele neck, placed over a part of the ukulele where this bank can press a fairly comprehensive combination of notes/chords - enough to play most, if not all, songs. These solenoids are controlled using a PIC32MX795F512H microcontroller and a driver circuit. The ukulele and solenoid housing were designed and constructed by 3D printing and laser cutting. Coding was done in C and Python.
Mechanical Design
The mechanical design of this project boils down to 4 main objectives: it had to securely house the ukulele, place the solenoid bank in the correct position to hit the right chords, be stable, and handle any heat dissipation from the solenoids. The final result of these objectives can be seen both in ".stl" format and in real life below:
The design can be further broken down into 3D printed parts and laser cut parts. The 3D printed parts were responsible for housing the ukulele as well as providing stability to the design. As such, a wide-based structure shaped around the ukulele was designed, shown in the stl file below:
The laser cut parts were responsible for housing the solenoids above the right part of the ukulele such that they are able to press chords. Three layers were designed so that the solenoids are compressed from the bottom and top and aligned in the middle. This three-layer design was chosen so that the solenoids receive airflow to prevent overheating. Each layer fits around the threaded bars, as can be seen in the full assembly above, and the layers are separated by nuts on the bars. The three layers shown from left to right are a bird's eye view of the top, middle and bottom layer respectively.
Electronics
Microcontroller
The microcontroller used to control this project is the PIC32MX795F512H, used with the NU32 breakout board.
Circuitry
The circuitry used to drive each solenoid consists of a digital I/O pin from the microcontroller going through a 1 kΩ resistor to the base of an NPN Darlington transistor. The collector is connected to a 7.5V power supply (32W, 4.32A AC/DC wall adapter) through the parallel combination of the solenoid and a flyback diode, and the emitter is connected to ground. The datasheets for the components used can be found here. A simpler circuit connects a PIC32 digital I/O pin to an LED through a 330 Ω resistor; this LED is used as the start and strum light to indicate to the user when to strum. Both circuit designs are shown below; from left to right are the solenoid driver and the LED circuit:
Code
The code for this project was split into microcontroller code, written in C, and user code, written in Python. The purpose of the microcontroller code is to read a song sent to the microcontroller by the user code and convert it into digital highs and lows so that the solenoids actuate at the correct times. This allows the H-RUP to press chords on the ukulele in the correct sequence and timing, producing a song. The user code is a Python executable called Ukulele Jukebox; it gives the user a choice of either writing their own song to be played with H-RUP or choosing from a list of preset songs.
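The exact message format between the user code and the firmware isn't detailed here, so the sketch below is a hypothetical Python encoding of the idea: a song as (chord, beats) pairs streamed over serial (the port name, baud rate and message format are all assumptions):

```python
import serial  # pyserial
import time

SONG = [("C", 4), ("Am", 4), ("F", 4), ("G7", 4)]  # e.g. a simple progression

def play(song, port="/dev/ttyUSB0", bpm=120):
    """Stream each chord to the firmware, which maps it to a solenoid pattern."""
    beat = 60.0 / bpm
    with serial.Serial(port, 115200) as link:
        for chord, beats in song:
            link.write(f"{chord},{beats}\n".encode())  # firmware presses the chord
            time.sleep(beats * beat)                   # LED cues the user to strum
```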
Future Work
Further work on this project could include:
Redesigning the mechanical structure to make it smaller, perhaps small enough to hold like a normal ukulele.
The design and printing of a dedicated PCB for the H-RUP as opposed to leaving everything on a breadboard.
Adding a scoring system, that listens to the user's strumming and compares it to the timing of the Strum LED.
Adding more songs to the repertoire.
Further Reading
For more detailed descriptions of how this all came together, please visit the project's Github repository found here.
This Mario Kart project was undertaken with the following two main objectives in mind from the start:
The robot must follow a line mapped out on the floor
The robot must be unique and look great!
All Mechanical Designs and Code for this project can be found here.
Design Constraints
The following were the constraints given for the design of the robot:
The robot must be wheeled.
Line sensing must be done with the provided USB camera.
Only two servo motors may be used to propel the robot.
Control of the robot will be done with a PIC32 microcontroller.
Hardware
All mechanical design of this robot was done using Onshape, a browser-based CAD software. Electronic and PCB design was done with Eagle.
Mechanical
All CAD designs of the parts necessary to construct this robot are included here, but the main components are described briefly below.
Body, Wheels and Camera Mount
The body shown below is made of laser cut plywood. The front fender is 3D printed PLA plastic. The body was designed to house the electronics, battery, servo motors and LCD screen in a way that keeps the weight of the vehicle balanced.
The wheels were designed to look like Mario Kart wheels, with a big M in the center. These were 3D printed out of PLA plastic.
The camera mount was made of laser cut plywood with a 3D printed PLA plastic Mario badge in its center. It was both a necessary functional holder for the camera and a style feature, fitting well within the Mario Kart theme.
Koopa Cannons
The next features, added purely for a little bit of style and fun, were cannons that shoot little Koopa shells out of the back of the robot - like in Mario Kart. The cannons and Koopa shells were both 3D printed in PLA plastic.
Final
The final CAD design of the Mario Kart can be seen in its entirety below.
Electrical
As mentioned before, the robot is controlled by a PIC32 microcontroller. In order to interface the microcontroller with the peripherals necessary to operate the robot (camera, LCD, motors), a PCB was developed. Below is a close-up view of the PCB. The motors were driven with PWM signals through an H-bridge; the LCD communicated over SPI; and the camera communicated over I2C. Below is a broader image of the PCB and the connected peripherals. A further circuit, seen in the above images, was necessary to drive the solenoids used in the Koopa shell cannons. This solenoid driver circuit is shown below.
Software
All the code used to control the robot was C code programming the PIC32 microcontroller. The code enables the microcontroller to communicate with the camera and receive line-following information from it. This information was displayed on the LCD screen for testing. Using this information, a control loop was developed such that the duty cycles of the PWM signals sent through the H-bridge to the motors varied according to where the black line sat relative to the camera. This allowed the Mario Kart to follow the line. The solenoid-based Koopa shell cannons were set to fire at a random time once the robot was moving. All the code used can be found here.
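The control law amounts to proportional steering on the line's horizontal offset; a minimal Python sketch of the idea (the real code is C on the PIC32, and the constants here are illustrative):

```python
def wheel_duties(line_x, frame_width=320, base_duty=0.5, k_p=0.004):
    """Proportional steering: slow the wheel on the side the line drifts toward."""
    error = line_x - frame_width / 2  # signed pixel offset from image center
    left = max(0.0, min(1.0, base_duty + k_p * error))
    right = max(0.0, min(1.0, base_duty - k_p * error))
    return left, right  # PWM duty cycles for the two drive motors
```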
This project was to build and code an intelligent motor controller. The controller receives inputs from the client through a MATLAB user interface, which in turn receives the results from the motor and plots and displays them to the client. The system is able to follow a reference trajectory, velocity or torque by spinning a brushed DC motor's shaft with an inertial load attached. This is achieved using two feedback control loops, implemented as digital PID controllers, which use values obtained from the encoder built into the motor as well as from a current sensor.
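As a sketch of the kind of update each loop performs (in Python for illustration; the actual controllers run in C on the PIC32, and the gains and sample time are placeholders):

```python
class PID:
    """Discrete PID controller of the kind used in each feedback loop."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, reference, measurement):
        error = reference - measurement
        self.integral += error * self.dt              # accumulated error
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. an outer position loop on encoder counts feeding an inner current loop:
position_loop = PID(kp=50.0, ki=0.0, kd=2.0, dt=0.005)
current_loop = PID(kp=0.2, ki=10.0, kd=0.0, dt=0.0002)
```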
Video
Below is a video showcasing the DC Motor being controlled:
Software
All of the software can be found here. The two parts of this project are split into the DC motor control and the client interface.
C code was used to program the PIC32MX795F512H microcontroller with an NU32 breakout board.
MATLAB code was used to develop the client interface.
The interface looks like this:
The above options presented to the client represent the capabilities of the controller.
Hardware
The hardware used in this project includes:
Brushed DC Motor with a plastic bar attached to the shaft as an inertial load.
An Encoder attached to the motor.
A PIC32MX795F512H microcontroller with an NU32 breakout board.
A dsPIC33FJ64MC802 microcontroller used with a breakout board, programmed as a decoder.
A breakout board for the DRV8835 H-Bridge.
A breakout board for the MAX9918 current-sense amplifier.
An nScope digital oscilloscope, used for debugging.
The circuitry was connected as follows:
Result
The image below shows how the motor was able to track a cubic trajectory with an average error of only 1.1 degrees.
For my final project for an Embedded Systems course I took, focusing on ROS, my group and I used a Rethink Robotics Sawyer to tune a ukulele.
Explanation and Strategy
The goal of this project was to have a Rethink Robotics Sawyer robot autonomously tune a ukulele. The ukulele, a custom pick and a custom tuning peg were all placed in pre-designated locations on a platform along with an alvar tag. Using its built-in head camera, Sawyer sensed the alvar tag and thereby knew the relative locations of each of the aforementioned tools, allowing it to tune the ukulele. The tuning process can be broken down into the following pattern (a sketch of the pitch-error calculation follows the list):
Pick up the pick
Pluck the designated ukulele string
Listen to the produced pitch and find error from expected pitch
Set down the pick
Pick up the tuning block
Move to the designated tuning peg on the ukulele
Turn the peg by an amount proportional to the pitch error
Set down the tuning block
Repeat steps 1-8 until pitch error is below a specified tolerance
Repeat steps 1-9 on each of the 4 ukulele strings
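The pitch error in step 3 is naturally expressed in cents, i.e. hundredths of a semitone; a small Python sketch of that calculation (the 440 Hz target matches the A4 tuning shown in the video, and the function name is ours):

```python
import math

def cents_error(f_measured, f_target=440.0):
    """Signed pitch error in cents; 100 cents is one semitone."""
    return 1200.0 * math.log2(f_measured / f_target)

# Tuning stops once abs(cents_error(f)) drops below the tolerance (15 cents).
```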
After completing all of these steps, the expectation was to have a tuned ukulele, fit for use by the great man himself, Israel Kamakawiwoʻole.
Result
We were successful in getting Sawyer to tune 1 string of the ukulele but unfortunately did not have time to calibrate the system for the rest of the strings. The video below demonstrates this, tuning the ukulele's A string to an A4 note, within 15 cents.
Video
Code
The code used for this project, as well as further documentation on the project can be found in this github repository.
As part of a class I took on Geospatial Vision and Visualisation, my project team and I undertook a project focused on object detection from point cloud data. Specifically, we were looking to isolate light poles from point cloud data taken by a lidar of a general "street scene". More information about this project can be found in great detail on this github page, including code, point cloud data (both raw and refined), results and a presentation on our approach to the problem.
Methodology
The open source Python library Open3D was used to create point cloud objects as well as to implement segmentation and filtering.
The raw point cloud data was converted from latitude-longitude-altitude (LLA) coordinates to Cartesian (XYZ) coordinates.
The data was then filtered to find the poles; this is explained further under "Algorithms" below.
The Python library Sklearn was used for the machine learning techniques applied.
Algorithms
Data Preprocessing
As mentioned above, the first step of preprocessing is to convert the data from LLA coordinates to XYZ coordinates. This allows the data to be processed and viewed as a 3D point cloud:
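One standard way to do this conversion is geodetic-to-ECEF using the WGS84 ellipsoid; a Python sketch (we are assuming this flavor of LLA-to-XYZ conversion; a local tangent-plane frame would work similarly):

```python
import numpy as np

# WGS84 ellipsoid constants
A = 6378137.0           # semi-major axis [m]
E2 = 6.69437999014e-3   # first eccentricity squared

def lla_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic LLA to Earth-centered Cartesian XYZ coordinates."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * np.cos(lat) * np.cos(lon)
    y = (n + alt_m) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + alt_m) * np.sin(lat)
    return np.stack([x, y, z], axis=-1)
```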
Next, the data is downsampled to reduce the amount of processing required. This is done by first applying a voxel grid filter, which takes a spatial average of the data points within each voxel (3D box) of the given size. After that, a further filter removes the statistical outliers left in the point cloud. This reduced the number of points to process from 1,292,208 to 39,630 while keeping the data accurate. The downsampled cloud is shown below:
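With Open3D, this downsampling stage looks roughly like the following (the filename and parameter values are illustrative, not the project's exact settings):

```python
import open3d as o3d

pcd = o3d.io.read_point_cloud("street_scene.pcd")  # hypothetical input file

# Voxel grid filter: average the points inside each voxel of the given size.
down = pcd.voxel_down_sample(voxel_size=0.5)

# Statistical outlier removal: drop points far from their neighbors.
clean, kept_idx = down.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
```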
Finding the poles
With the downsampled data, the next step is to distinguish the points that belong to a pole from those that do not. A technique called PLANAR FILTERING was used. The process can be summarized as follows: points in the cloud are snapped to a grid, and for each set of points that lies on a vertex in two dimensions - in this case the X-Y plane - the variation in the third dimension - the Z axis - is used to filter data points. This way, points that span the Z axis beyond a certain height threshold can be used to single out the poles.
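A minimal numpy sketch of that filter (the grid cell size and height threshold are illustrative; the project's actual parameters may differ):

```python
import numpy as np

def planar_filter(points, cell=1.0, min_z_extent=4.0):
    """Keep points in X-Y grid cells whose Z range is tall and pole-like."""
    keys = np.floor(points[:, :2] / cell).astype(int)  # snap X-Y to grid cells
    mask = np.zeros(len(points), dtype=bool)
    for key in np.unique(keys, axis=0):
        in_cell = np.all(keys == key, axis=1)
        z = points[in_cell, 2]
        if z.max() - z.min() > min_z_extent:  # large vertical variation
            mask |= in_cell
    return points[mask]
```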
The next step necessary to isolate the light poles is CLUSTERING. The initial clustering was done using k-means on the x and y coordinates, accompanied by a method known as the elbow method, which was used to determine the number of clusters (poles) in the cloud.
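With scikit-learn, the k-means and elbow steps look roughly like this (the range of k is illustrative):

```python
from sklearn.cluster import KMeans

def elbow_inertias(xy, k_max=10):
    """Within-cluster sum of squares for k = 1..k_max; the 'elbow' in this
    curve suggests the number of pole clusters."""
    return [KMeans(n_clusters=k, n_init=10, random_state=0).fit(xy).inertia_
            for k in range(1, k_max + 1)]

# Once k is chosen, label each (x, y) point with its pole cluster:
# labels = KMeans(n_clusters=k).fit_predict(xy)
```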
Result
The picture below shows the resulting image after all the processing is done; it clearly shows only the light poles in the cloud.
For my final project for a Machine Dynamics course I took, I mathematically modeled and simulated the basic dynamics of the plastic impacts involved in the system. The system was set up such that a mass falls on one end of the seesaw, and this plastic impact prescribes the resulting trajectory of the mass on the other end of the seesaw. The simulation was done in Mathematica using Euler-Lagrange principles.
Simulation Video
Code
The Wolfram Mathematica code used to develop this simulation can be found here.
Code Explained
The first part of the code defines a few useful functions that are used later in the program.
The next part of the code sets up all of the transformation matrices - with reference to a ground frame - of all of the necessary points used to model the dynamics.
The next section defines a few parameters that can be adjusted such as the mass of the seesaw and the 'weights', the spring coefficients, the gravitational acceleration, etc.
The code then uses these parameters and transformation matrices to set up the Lagrangian equation used to describe the system dynamics, i.e. the difference between the kinetic and potential energies of the system.
The pre-impact conditions and constraints are set up such that the system follows its expected pre-impact behaviour. Next, the impacts are evaluated. This section defines the impact update laws and evaluates the plastic collisions that occur between the masses and the seesaw, while maintaining constraints such as the springs remaining attached to the seesaw and the ground.
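For reference, this is the standard machinery behind those steps, written in generic symbols rather than the notebook's exact variables: the Lagrangian and Euler-Lagrange equations govern the smooth dynamics, and a plastic impact applies a momentum jump along an impact constraint phi(q) = 0:

```latex
% Smooth dynamics: Lagrangian and Euler-Lagrange equations
L(q,\dot{q}) = T(q,\dot{q}) - V(q), \qquad
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0

% Plastic impact on constraint \phi(q) = 0: the generalized momentum jumps
% along the constraint normal, and the post-impact velocity obeys the constraint
\left.\frac{\partial L}{\partial \dot{q}}\right|_{t^{+}}
- \left.\frac{\partial L}{\partial \dot{q}}\right|_{t^{-}}
= \lambda\,\frac{\partial \phi}{\partial q}, \qquad
\frac{\partial \phi}{\partial q}\,\dot{q}^{+} = 0
```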
Evaluating these laws and constraints then gives the post-impact initial conditions and behaviour, which is then simulated.
The last section of the code is used to plot the results of this simulation and then animate those results. This animation is shown above.
In today's day and age, everyone claims to know where to find the best wine! But who can you trust? Well, we decided to explore whether we could use a machine learning classification approach to distinguish a good wine from a bad one!
This project set out to use an established dataset of red and white wines (6497 wines, to be exact) to create a model that predicts a wine's quality. Each wine has 11 features, from volatile acidity to alcohol content, as well as a quality rating out of 10. Our approach, detailed below, was to treat this as a classification problem whereby wines rated 0-4 are bad, 5-6 are good and 7-10 are great! More detail, including the code used as well as credit for the wine dataset, is available here.
Approach
As mentioned above, the approach we took was to treat this as a classification problem. This means we had to assign a label of bad (0), good (1) or great (2) to each wine. The first step was reading, understanding and preprocessing the data to be ready for modelling.
Data Handling
In order to handle the available data, we used the pandas library. This allowed us to easily read and process each feature of each wine. We added a new column to the data holding the labels described above.
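In pandas, that labelling step looks roughly like this (the filename and column names are illustrative):

```python
import pandas as pd

wines = pd.read_csv("winequality.csv")  # hypothetical combined red/white dataset

# Bin the 0-10 quality score into the three classes: bad (0), good (1), great (2).
wines["label"] = pd.cut(wines["quality"], bins=[-1, 4, 6, 10],
                        labels=[0, 1, 2]).astype(int)
```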
K Fold Cross Validation
The next preprocessing step is to split the data into k equal data sets, or folds; in this case we used k = 3. This allows us to use 1 fold as a test set while the rest are used for training. This is done multiple times, so that each fold is used as a test set once. It is a great way to ensure an accurate assessment of whatever model is used.
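Continuing the sketch above, scikit-learn's KFold produces exactly these splits:

```python
from sklearn.model_selection import KFold

X = wines.drop(columns=["quality", "label"]).values  # the 11 feature columns
y = wines["label"].values

kf = KFold(n_splits=3, shuffle=True, random_state=0)
for train_idx, test_idx in kf.split(X):
    X_train, X_test = X[train_idx], X[test_idx]
    y_train, y_test = y[train_idx], y[test_idx]
    # fit and score a model here; each fold serves as the test set exactly once
```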
Modelling
Using the sklearn Python library, we were able to try a few modelling techniques we thought were worth testing and assess which performed best. We used and tested the following models (a sketch of the comparison loop follows the list):
Gaussian Model
Logistic Regression
Decision Tree Classification
Random Forest Classification
K Nearest Neighbours Classification
Multilayer Perceptron Neural Network
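A condensed version of that comparison, continuing the sketches above under the same 3-fold scheme (the hyperparameters shown are defaults or illustrative, not our tuned values):

```python
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

models = {
    "GaussianNB": GaussianNB(),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNeighbors": KNeighborsClassifier(),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=3)  # 3-fold accuracy
    print(f"{name}: {scores.mean():.3f}")
```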
Conclusion
We found that the best performing model was the RandomForest classifier. Simpler models such as GaussianNB and KNeighbours did not perform as well. This did not surprise us, given the complexity of the data and the difficulty of finding clear predictive features from visual inspection of the data. More complex models such as the RandomForest, which performs implicit feature selection, were better able to capture the data. We also found that some features were helpful to the model and some were not at all: for example, alcohol content correlated linearly with quality, but density and pH value were entirely unrelated to quality.
Video
This project is further described by my project teammate in the following video.
For my final thesis of my undergraduate degree (B.Sc Mechatronics), I set out to develop a computer vision software system, investigating different methods with the eventual goal of isolating an object of interest undergoing motion from a scene that exhibits background motion.
Specifications
The development of the system was done:
Using C++ programming language.
Using the open source computer vision library, OpenCV.
Within the framework of the Code::Blocks IDE.
On a frame-by-frame basis: each frame had the necessary image processing and vision techniques applied to it.
System Development
The code developed and investigated for this project can be split into three main tasks (a sketch of the pipeline follows the list):
Preprocessing: These are the image processing techniques, such as blurring and edge detection, applied before any background subtraction.
Background Subtraction: These are statistically based, pixel-level algorithms that separate each frame into foreground objects and background objects.
Object of Interest Identification: These are methods for matching a template image of a predetermined object of interest to any foreground objects found in the binary image produced by the background subtraction stage.
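The original system was written in C++ against OpenCV; the same three-stage pipeline can be sketched with OpenCV's Python bindings (parameters are illustrative, and the identification stage here uses a simpler largest-contour stand-in rather than full contour matching):

```python
import cv2

cap = cv2.VideoCapture("scene.mp4")                # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2()  # mixture-of-Gaussians model

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # 1. Preprocessing: Gaussian blur to suppress sensor noise.
    blurred = cv2.GaussianBlur(frame, (5, 5), 0)
    # 2. Background subtraction: per-pixel foreground/background mask.
    mask = subtractor.apply(blurred)
    # 3. Identification: box the largest foreground contour as a stand-in
    #    for matching against the object-of-interest template.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```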
Results and Conclusions
The images below show the two best performing combinations of methods. In both images, the top left block shows the raw frame, the top right block shows the binary image produced by background subtraction, the bottom left block shows the template matching method in progress and the bottom right block shows the raw frame with a bounding box surrounding the object of interest.
These are screenshots of the contour matching (left) and feature homography (right) methods being used for object of interest identification.
After running numerous tests - with a testing environment set up to reward accuracy, performance and false positive/negative response - the final best performing system used Gaussian blurring and Sobel edge detection for preprocessing, a mixture of Gaussians method for background subtraction, and a method that finds and matches the maximum-length contour in the template image and the current frame (contour matching) for object of interest identification.
Video
This video shows the best performing combination of methods in action: TO BE ADDED SOON
Further Reading
If you would like access to the full thesis or would like to find out more about it, please don't hesitate to contact me.