

A data-driven framework for integrating visual inspection into the injection moulding pipeline
Recent advances in machine vision have led to new opportunities for automating the entire manufacturing pipeline. Consider, for example, an unattended computer vision system that inspects each widget and decides whether or not to discard it. Even this modest degree of automation can save hundreds of person-hours on a typical factory floor. For simple designs, automated inspection methods relying upon lasers, 3D scanning, or other imaging modalities can already decide whether a widget has a defect; for complex designs, this ability remains elusive. More importantly, however, automated inspection schemes can only decide whether a widget deviates from its intended design, say one available in the form of a CAD drawing; they cannot decide what changes should be made further down the manufacturing pipeline to prevent similar defects in the future. This project aims to explore machine learning techniques that integrate automated inspection with the manufacturing process. Specifically, we will focus on the injection moulding process. We will develop new theory and methods for characterizing the injection moulding process in terms of quantities measured via an automated inspection system. This effort will lead to a deeper understanding of the role of the myriad parameters that control injection moulding processes.
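The go/no-go inspection decision described above can be sketched as a thresholded comparison between a captured image and a reference rendering (e.g. derived from the CAD model). Everything here, including the tolerances and the synthetic images, is a hypothetical illustration rather than the project's actual method.

```python
import numpy as np

def inspect_widget(captured, reference, pixel_tol=0.1, defect_fraction=0.01):
    """Flag a widget as defective when too many pixels deviate from the
    reference rendering. Both thresholds are hypothetical."""
    deviation = np.abs(captured.astype(float) - reference.astype(float))
    defective_pixels = np.mean(deviation > pixel_tol)
    return bool(defective_pixels > defect_fraction)

# Hypothetical data: a clean part and one with a localized flaw.
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
good = reference + rng.normal(0, 0.01, reference.shape)   # sensor noise only
bad = reference.copy()
bad[10:20, 10:20] += 0.5                                  # simulated defect region

print(inspect_widget(good, reference))  # False: noise stays within tolerance
print(inspect_widget(bad, reference))   # True: defect region exceeds threshold
```

A real system would of course replace the pixel difference with a learned model, but the decision structure (deviation measure plus acceptance threshold) is the same.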
Industry Partner(s): Axiom Group
Academic Institution: Ontario Tech University
Academic Researcher: Qureshi, Faisal
Focus Areas: Advanced Manufacturing, AI


Automatic Identification of Phytoplankton using Deep Learning
Under the impact of global climate change and human activities, harmful algae blooms (HABs) have become a growing concern due to their negative impacts on water-related industries, such as the safe water supply for fish farmers in aquaculture. The current method of algae identification requires water samples to be sent to a human expert to identify and count all the organisms. Typically this process takes about one week, as the water sample must be preserved and then shipped to a lab for analysis. Once at the lab, a human expert must manually look through a microscope, which is both time-consuming and prone to human error. Therefore, reliable and cost-effective methods of quantifying the type and concentration of algae cells have become critical for ensuring successful water management. Blue Lion Labs, in partnership with the University of Waterloo, is building an innovative system to automatically classify multiple types of algae in situ and in real time using a custom imaging system and deep learning. This will be accomplished in two main steps. First, the technical team will work with an in-house algae expert (a phycologist) to build up a labelled database of images. Second, the research team will design and build a novel neural network architecture to segment and classify the images generated by the imaging system. The result of this proposed research will dramatically reduce the analysis time of a water sample from weeks to hours, and therefore enable fish farmers to better manage harmful algae blooms.
Industry Partner(s): Blue Lion Labs
Academic Institution: University of Waterloo
Academic Researcher: Wong, Alexander
Focus Areas: AI, Environment & Climate


Cloud Native Big Data Engineering and Automation
XLScout is a startup engaged in democratizing innovation and connecting research and development with intellectual property (IP) departments across the world. The company is investing in developing proprietary algorithms, using artificial intelligence and machine learning, to mimic the behaviour of an expert searcher. XLScout's focus is on creating a sustainable and adaptive text mining framework that will provide natural language processing (NLP)-based research outputs for IP in different domains. Various data points have been secured for our solution, including patent data, litigation data, corporate data, reassignment data, examination data, and other patent-related data. XLScout hosts a data vault of over 130 million patent documents, which occupies approximately 8 TB of storage. Document searching in general is a cumbersome process, and searching patent-related documents in particular requires advanced strategies that a novice searcher might not be aware of; this type of search therefore requires extensive effort and time. The main objective of this project is to automate patent and non-patent search effectively by allowing machines to understand users' queries, thus creating a sustainable and adaptive text mining framework that will provide NLP-based research outputs for IP search in different domains. Furthermore, this project aims to develop scalable solutions for XLScout's data vault on which the company will run proprietary AI and ML models and generate high-value analytic solutions to help customers make informed decisions.
Industry Partner(s): XLScout
Academic Institution: Western University
Academic Researcher: Grolinger, Katarina
Focus Areas: AI, Business Analytics


Combining optical imaging and deep learning for better seizure localization to improve epilepsy surgery and deep brain stimulation therapies
Epilepsy is a debilitating neurological condition defined by recurrent, unprovoked seizures characterized by hypersynchronous neuronal discharge. Although anti-epileptic drugs have shown effectiveness in treating epileptic seizures, approximately 30% of patients are medically refractory, i.e., do not respond to them.
This project will use computational strategies based on Neurescence's proprietary hardware, in combination with machine learning, to extract a clinically relevant biomarker to support surgical planning, as well as to accurately determine seizure onset time to optimize deep brain stimulation (DBS).
Industry Partner(s): Neurescence Inc.
Academic Institution: Carleton University
Academic Researcher: Joslin, Chris
Platform: GPU




Industry Partner(s): Lytica Inc.
Academic Institution: University of Ottawa
Academic Researcher: Burak Kantarci
Focus Areas: Advanced Manufacturing, AI, Business Analytics, Supply Chain


Deep Learning with Big Data for Innovation Acceleration
Presently, XLScout hosts data on over 130 million patents and 200+ million research publications, occupying approximately 8 TB of storage. Searching such a massive database using basic, mostly keyword-based search is very cumbersome and time-consuming. Moreover, it requires domain-specific knowledge about patents. With this project, XLScout aims to alleviate the pain of searching this massive system by employing machine learning (ML) and natural language processing (NLP) techniques. Text autocompletion and recommendation will provide a better and smarter way for end-users to search this massive database. The autocompletion will be based on the corpus of patent documents and research publications to provide suggestions of relevant content. The ML model will be trained on that massive corpus, taking language semantics into consideration. Techniques such as BERT and GPT-2/3 will be considered, together with various pre-processing techniques. The size of the document database and the diversity of documents, together with the subjectivity of desired results, will make such a system challenging to evaluate; we will employ both human-centric and automated evaluation approaches. Document categorization will also be based on semantics, grouping documents into labelled categories. Unsupervised techniques will be examined for their ability to perform this categorization. However, different companies, XLScout's clients, have different preferences with respect to this categorization. Therefore, an approach will be developed for end-users to express their preferences by providing sample categories; the model will then learn from those preferences and carry out the categorization. The challenge is to enable unsupervised categorization while supporting semi-supervision and customization. This categorization will be client-company specific, and the model will have to learn from a limited number of example classes identified by the end-user.
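Before transformer models such as BERT enter the picture, the core retrieval idea can be illustrated with a bag-of-words cosine-similarity search over a toy corpus. The corpus, vocabulary, and query below are invented for illustration; a production system would use learned dense embeddings instead of raw word counts.

```python
import numpy as np
from collections import Counter

def embed(text, vocab):
    """Represent a document as a word-count vector over a fixed vocabulary."""
    counts = Counter(text.lower().split())
    return np.array([counts[w] for w in vocab], dtype=float)

# Toy stand-in for the patent corpus.
corpus = [
    "battery cell thermal management system",
    "neural network image classification method",
    "injection moulding process control apparatus",
]
vocab = sorted({w for doc in corpus for w in doc.lower().split()})
doc_vecs = np.array([embed(d, vocab) for d in corpus])

def search(query):
    """Return the corpus document most similar to the query (cosine similarity)."""
    q = embed(query, vocab)
    sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * (np.linalg.norm(q) or 1.0))
    return corpus[int(np.argmax(sims))]

print(search("image classification neural"))
```

Replacing `embed` with a sentence-encoder call is the usual step up from this sketch; the ranking logic stays identical.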
Industry Partner(s): XLScout Ltd.
Academic Institution: Western University
Academic Researcher: Katarina Grolinger
Focus Areas: AI, Business Analytics


Design and Development of Autonomous Disinfecting Embedded Systems for COVID-19
One of the major challenges during the COVID-19 pandemic is frequent disinfecting, which is especially critical in places like hospitals and long-term care facilities. In most places, human operators perform the cleaning, but doing so puts them at risk of infection because of shortages of personal protective equipment (PPE) and the many unknowns of COVID-19. The aim of this COVID-19 project is to improve Cyberworks Robotics' navigation technology on existing (a) floor disinfection machines (e.g. wet floor scrubbers) of various types used in hospitals, (b) high-intensity UV disinfection machines, and (c) chemical mist disinfection machines. This would allow hospitals to disinfect surfaces more frequently than is possible with human cleaners (due to both the cost and the availability of human operators) and simultaneously increase the quality of cleaning by ensuring that surfaces are not missed due to human error or neglect.
Industry Partner(s): Cyberworks Robotics
Academic Institution: Ontario Tech University
Academic Researcher: Azim, Akramul
Platform: Cloud




Hybrid quantum-classical simulation and optimization platform for industrials
Predictable power is valuable power, because the intermittent nature of renewables such as wind energy poses several challenges for planning the daily operation of the electric grid. As their power fluctuates over multiple time scales, the grid operator has to adjust its day-ahead, hour-ahead, and real-time operating procedures. In the absence of cheap and efficient energy storage to make up for sudden power generation shortfalls or excesses on the grid caused by renewables, the grid operator today sends a signal to power plants approximately every four seconds to ensure a balance between total electric supply and demand. To strengthen the business case for wind power and further drive the adoption of carbon-free energy on electric grids worldwide, novel strategies are required to stabilize the balance of the grid as the number of renewable generators connected to it increases.
Our main goal in this collaboration is to design novel models and algorithms that improve hour-ahead and day-ahead renewable energy prediction. As a first step, we are creating a novel weather simulation model using knowledge-guided, or physics-informed, machine learning (PIML) methods and machine-learning-accelerated computational fluid dynamics (CFD) techniques. In parallel, we will explore the suitability of current and near-term quantum computers as co-processors to help address the need for high computational power when simulating complex systems such as wind.
Industry Partner(s): Foreqast
Academic Institution: University of Waterloo
Academic Researcher: Kempf, Achim
Platform: Cloud, GPU, Parallel CPU
Focus Areas: AI, Clean Tech, Energy, Environment & Climate




Machine Learning Methods for Behavioural Biometrics
The project will focus on creating a data architecture composed of the models, policies, rules, and standards that govern which data is collected and how it is stored, arranged, and integrated into the system. The task of the professor and the student will be to create a data architecture and a machine learning algorithm that will help build a robust user-profile system able to extract, store, build, and analyse up to 1 million user profiles. This profile system will also be in charge of generating at least 50 behavioural data points, or 64 bytes of data, per second. Simultaneously, the data architecture will provide over 5 million user-profile recognitions per day through the use of predictive modelling and a REST API call. This will help detect suspicious activities and anomalies without the use of browser cookies, location, or hardware information. For this project, the professor and the student will be given access to a part of F8th's repository and must work together as a team to build a user-profile system that will help extract the given user profiles. This data architecture will help tolerate the behavioural data noise caused by changes of input device, such as a mouse, keyboard, mobile, or laptop. As F8th IDaaS is an Identity-as-a-Service solution, it is necessary to provide predictions from the trained models (also known as behavioural biometrics). The market requires model predictions to be made in less than 1 second (something that will be discussed further once the project is underway). The service must be provided at a reasonable cost and time; hence, it is very important to monitor the data as well as resource consumption continuously.
Lastly, as the project is to be worked on by the professor and the student as a team, constant discussions, comparisons, and decisions regarding the quality and affordability of the solution will have to be made jointly.
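As a flavour of what a behavioural-biometric signal looks like, the sketch below extracts keystroke-timing features and compares them against an enrolled profile. The feature set, thresholds, and profile statistics are all hypothetical; a real system would use many more signals (mouse dynamics, pressure, etc.) and a trained model rather than a z-score rule.

```python
import numpy as np

def timing_features(key_times):
    """Mean and spread of inter-keystroke intervals: a crude behavioural signature."""
    intervals = np.diff(np.asarray(key_times, dtype=float))
    return np.array([intervals.mean(), intervals.std()])

def matches_profile(sample_times, profile_mean, profile_std, z_max=3.0):
    """Accept the sample when its features fall within z_max standard
    deviations of the enrolled profile (hypothetical acceptance rule)."""
    z = np.abs(timing_features(sample_times) - profile_mean) / profile_std
    return bool(np.all(z < z_max))

# Hypothetical enrolment: this user types with roughly 120 ms between keys.
profile_mean = np.array([0.120, 0.030])
profile_std = np.array([0.015, 0.010])

genuine = [0.00, 0.11, 0.23, 0.36, 0.47, 0.59]   # intervals of about 0.11-0.13 s
impostor = [0.00, 0.40, 0.85, 1.20, 1.70, 2.05]  # a much slower typist

print(matches_profile(genuine, profile_mean, profile_std))   # True
print(matches_profile(impostor, profile_mean, profile_std))  # False
```

Note that no cookies, location, or hardware identifiers appear anywhere: the decision rests purely on timing behaviour, which is the point of the approach described above.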
Industry Partner(s): F8th AI
Academic Institution: Western University
Academic Researcher: Ouda, Abdelkader
Platform: Cloud
Focus Areas: AI, Cybersecurity, Health, ICT



Optimization of electrochemical carbon dioxide conversion using machine learning and finite element computational approach
This project will explore deep neural networks and random forest regressors to predict system outputs while optimizing over 100 parameters and a chemical space of size ~10^10 to maximize production rate and process efficiency. In addition to the machine learning-based computational approach, finite element simulations will improve electrolyzer design. This project builds upon a Mitacs project and will make use of data collected during XPRIZE operations. It will contribute towards the optimization of CERT's cell design for future operations.
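The value of a fast surrogate model in a 100-parameter space can be illustrated with a plain random search against a stand-in objective. The quadratic "production rate" below is entirely hypothetical, standing in for the trained random forest regressor that would be queried in place of physical experiments.

```python
import numpy as np

rng = np.random.default_rng(42)

def production_rate(params):
    """Hypothetical stand-in for the electrolyzer's predicted output:
    peaks when every parameter sits at its (unknown) optimum of 0.5."""
    return -np.sum((params - 0.5) ** 2)

def random_search(n_params=100, n_trials=5000):
    """Screen candidate settings and keep the best, mirroring how a cheap
    learned surrogate lets many more candidates be evaluated than experiments would."""
    best_params, best_score = None, -np.inf
    for _ in range(n_trials):
        candidate = rng.random(n_params)
        score = production_rate(candidate)
        if score > best_score:
            best_params, best_score = candidate, score
    return best_params, best_score

params, score = random_search()
print(round(score, 3))
```

In practice, the random search would be replaced by a smarter optimizer (Bayesian optimization, genetic algorithms), but the surrogate-in-the-loop structure is the same.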
Industry Partner(s): CERT Systems
Academic Institution: University of Toronto
Academic Researcher: Sargent, Ted
Platform: GPU, Parallel CPU
Focus Areas: AI


Real-time food analysis using deep learning for Diabetes Self-Monitoring
Diabetes is the 6th leading cause of death in Canada and affects over 3 million people; this number grows by 549 new cases daily. Diabetes patients often manage their condition via a combination of food intake and insulin. This method can cause blood glucose levels to fluctuate due to uncertainty in the food portions consumed and can lead to severe illness. At Glucose Vision, our goal is to create a smartphone application capable of pre-evaluating diabetes patients' meals before they consume them, with the snap of a picture. We aim to accomplish this by incorporating AI, machine learning, and computer vision into our smartphone application. The first part of this project will be to assemble a dataset of foods tagged with nutritional information that can be used to train an image recognition algorithm. This will form the backend of the application, allowing users to snap a photo of their meal and be shown nutritional information, such as a carbohydrate count, based on the serving size. We will also require machine learning algorithms to make the meal information on specific foods, such as homemade dishes, more accurate and more personalized to the patient. By developing a model that uses these technologies, we believe we can create a smartphone application that will revolutionize how diabetes patients manage their condition and allow users to maintain consistent and healthier blood sugar levels.
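Once the classifier has identified the food and a portion size has been estimated, the carbohydrate count itself is simple arithmetic. The nutrition table below is a hypothetical placeholder for the tagged dataset described above.

```python
# Hypothetical nutrition table: grams of carbohydrate per 100 g of food.
CARBS_PER_100G = {"rice": 28.0, "apple": 14.0, "chicken breast": 0.0}

def carb_count(food, portion_g):
    """Scale per-100 g carbohydrate content to the estimated portion size,
    as the app would after the image classifier identifies the food."""
    return CARBS_PER_100G[food] * portion_g / 100.0

print(carb_count("rice", 150))  # 42.0 g of carbohydrate for a 150 g serving
```

The hard problems are upstream (recognizing the dish and estimating the portion from a photo); this lookup step is deliberately trivial.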
Industry Partner(s): Glucose Vision
Academic Institution: Ryerson University
Academic Researcher: Khan, Naimul


Speech enhancement and recognition with generative adversarial network
This project will use a generative adversarial network (GAN) to learn noise patterns and improve the intelligibility and quality of speech, so that speech recognition can operate on enhanced audio with improved word accuracy. The recent breakthrough in speech enhancement is leveraging GANs to denoise raw audio. The research team will implement and experiment with various GANs (SEGAN, WGAN, SEGAN-Aco, SEGAN-PTAco, etc.) built from stacked deep convolutional layers. These GANs require hundreds of epochs to generate satisfying results with hundreds of hours of raw audio files and different combinations of noise patterns. Sufficient computing resources will therefore boost the training process, giving the researcher more time to evaluate model performance and test novel structures.
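A full SEGAN is well beyond a short sketch, but the evaluation side of the project can be illustrated: given a clean reference, enhancement quality is commonly quantified as a signal-to-noise ratio (SNR) improvement. Below, a crude moving-average filter stands in for the learned enhancer; the signal and noise are synthetic.

```python
import numpy as np

def snr_db(clean, degraded):
    """Signal-to-noise ratio in dB of a degraded signal against a clean reference."""
    noise = degraded - clean
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 8000)
clean = np.sin(2 * np.pi * 440 * t)            # toy stand-in for speech
noisy = clean + rng.normal(0, 0.3, t.shape)    # additive white noise

# Crude stand-in for the trained GAN generator: a 5-tap moving-average filter.
kernel = np.ones(5) / 5
enhanced = np.convolve(noisy, kernel, mode="same")

print(round(snr_db(clean, noisy), 1), round(snr_db(clean, enhanced), 1))
```

In the actual project, `enhanced` would come from the GAN generator, and perceptual metrics (PESQ, STOI) and downstream word accuracy would supplement raw SNR.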
Industry Partner(s): Pearson Inc.
Academic Institution: University of Toronto
Academic Researcher: Penn, Gerald
Platform: GPU
Focus Areas: AI, Digital Media



Thermal imaging for efficient detection of vital signs during COVID-19 pandemic
Early detection of symptoms of COVID-19 infection is of utmost importance. In this project, we are focusing on early detection using thermal imaging cameras that allow for non-contact, non-invasive monitoring of temperature, heart rate and breathing rate. We propose to develop two solutions: 1. to detect early symptoms by detecting an increase in people’s temperature through crowd sensing and 2. to monitor the vital signs of elderly patients continuously using thermal cameras next to the places where they spend most of their time. These solutions will be based on advanced signal processing algorithms and machine learning models.
At the end of the project, we will provide a stand-alone solution that will include a thermal camera, processing hardware and our software and algorithms. This will represent a minimum viable product that will be taken by our industrial partner J&M Group and further developed into a commercial product.
This project is important for Canada because it addresses early detection of COVID-19 both in active people who might not know that they have COVID-19 symptoms and in elderly people who are at home or in retirement homes and require constant, non-invasive monitoring of their vital signs.
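One of the signal processing steps described above, estimating breathing rate from a thermal time series, can be sketched as a spectral-peak search. The frame rate, breathing band, and simulated signal are illustrative assumptions, not the project's actual parameters.

```python
import numpy as np

fs = 10.0                       # assumed thermal camera frame rate (Hz)
t = np.arange(0, 60, 1 / fs)    # one minute of nostril-region temperature
breaths_per_min = 15            # simulated ground truth
signal = 0.2 * np.sin(2 * np.pi * (breaths_per_min / 60) * t)
signal += np.random.default_rng(3).normal(0, 0.05, t.shape)  # sensor noise

def breathing_rate(x, fs, lo=0.1, hi=0.7):
    """Locate the dominant spectral peak in a typical breathing band (0.1-0.7 Hz)
    and convert it to breaths per minute."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    band = (freqs >= lo) & (freqs <= hi)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60

print(round(breathing_rate(signal, fs)))  # 15
```

Heart-rate extraction follows the same pattern over a higher frequency band, with more preprocessing to isolate the relevant skin region in the thermal frames.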
Industry Partner(s): J&M Group
Academic Institution: University of Ottawa
Academic Researcher: Bolic, Miodrag
Platform: GPU



Unilever Canada Collective Intelligence Platform
We propose to develop a demand forecasting pipeline that collects data and applies machine learning models to make accurate predictions of future demand under unstable scenarios. In particular, this project will focus on efficient forecasting of Unilever's base SKUs and will involve model development based on time series methods, machine learning, and possibly combinations of the two; data engineering to identify critical features hinting at future trends; and deployment of the models to the existing business process. Methodologies that will be used to support the Collective Intelligence (CI) decision-making platform include time series methods such as vector autoregression and vector error correction models, and machine learning methods such as random forests and deep learning. The machine learning-based demand forecasting method for supply chain optimization should deliver on each component and seamlessly integrate applications into the main platform, which runs in a Microsoft Azure environment.
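The bridge between the time series and machine learning approaches mentioned above is lag features: recasting forecasting as supervised learning on past observations. The sketch below fits an autoregression by least squares on a synthetic seasonal demand series; a random forest or deep model would consume the same lag matrix. All data here is simulated.

```python
import numpy as np

def make_lag_matrix(series, n_lags):
    """Supervised design matrix: predict y[t] from the previous n_lags values."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

rng = np.random.default_rng(7)
t = np.arange(120)
# Simulated monthly SKU demand with a 12-period seasonal cycle plus noise.
demand = 100 + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)

X, y = make_lag_matrix(demand, n_lags=12)
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)

# One-step-ahead forecast from the most recent 12 observations.
forecast = coef[0] + demand[-12:] @ coef[1:]
print(round(forecast, 1))
```

Swapping `np.linalg.lstsq` for a tree ensemble, and adding exogenous features (promotions, weather), is the natural next step toward the pipeline described above.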
Industry Partner(s): Unilever Canada
Academic Institution: University of Toronto
Academic Researcher: Lee, Chi-Guhn
Platform: GPU, Parallel CPU
Focus Areas: AI, Business Analytics, Supply Chain