


Achieving Circular Wastewater Management with Machine Learning
Effective wastewater treatment is essential to the health of the environment, and municipal wastewater treatment plants in Canada are required to achieve specific effluent water quality goals to minimize the impact of human-generated wastewater on the surrounding environment. Most wastewater treatment plants combine physical, chemical, and biological unit processes and therefore have several energy inputs to drive mixing, maintain ideal temperatures, and move water from one unit process to the next. Methane and other gases (biogas) and biosolids are generated during wastewater treatment. Both can be captured and repurposed for use within and outside of the wastewater treatment plant and can in some cases even be converted into revenue streams. Thus, biogas and biosolids are considered recoverable resources rather than waste products. Circular wastewater management (CWM) is an emerging approach that aims to optimize wastewater treatment, energy usage, and resource recovery. To achieve CWM, the operators of wastewater treatment plants must have a thorough understanding and reliable control of the different elements of the system. This is usually achieved using a combination of operator expertise, online sensors, and offline water quality measurements coupled with data collection, storage, and analysis software. Conventional statistical approaches have traditionally been used to analyze the data generated in wastewater treatment plants, but these approaches are not flexible enough to fully describe and control the complex chemical and biological processes underlying wastewater treatment or to help utilities achieve CWM. Machine learning approaches are more flexible than conventional statistical approaches. In this project, we will use machine learning tools to identify opportunities to move towards circular wastewater management in a wastewater treatment plant operated by the Ontario Clean Water Agency.
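As a rough illustration of the flexibility gap described above, the sketch below (not the project's actual pipeline; the sensor variables, data, and models are purely hypothetical) compares a linear model against a more flexible gradient-boosted model on synthetic plant sensor data:

    # Minimal sketch, assuming hypothetical sensor features and a synthetic
    # effluent-quality response; not the project's actual data or models.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)
    n = 2000
    # Hypothetical online-sensor features: influent flow, temperature, DO, pH
    X = np.column_stack([
        rng.normal(500, 80, n),   # influent flow (m3/h)
        rng.normal(15, 4, n),     # temperature (C)
        rng.normal(2.0, 0.5, n),  # aeration basin dissolved oxygen (mg/L)
        rng.normal(7.2, 0.3, n),  # pH
    ])
    # Synthetic nonlinear response standing in for an effluent quality parameter
    y = 0.01 * X[:, 0] / (X[:, 2] + 0.5) - 0.05 * X[:, 1] + rng.normal(0, 0.3, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    for name, model in [("linear", LinearRegression()),
                        ("gradient boosting", GradientBoostingRegressor(random_state=0))]:
        model.fit(X_tr, y_tr)
        print(name, "MAE:", round(mean_absolute_error(y_te, model.predict(X_te)), 3))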
Industry Partner(s): Ontario Clean Water Agency
Academic Institution: York University
Academic Researcher: Stephanie L. Gora
Co-PI Names: Usman T. Khan, Satinder K. Brar
Platform: Cloud
Focus Areas: Cities, Clean Tech, Environment & Climate


An Integrated Risk Assessment Framework for Compound Flooding in Canadian Urban Environments
The simultaneous or subsequent occurrence of different flood drivers, including heavy rainfall, river overflow, and storm tides, threatens Canadian communities and infrastructure, especially in coastal environments. Analyzing flood drivers in isolation, without proper characterization of their interrelationships, can lead to a significant underestimation of flood risk. This can severely undermine resilience measures and lead to the misallocation of investment in flood protection. In this project, we will develop an integrated statistical and physically based modelling framework to simulate and predict compound flood risks under climate change in Canadian coastal zones and to develop effective mitigation plans. The proposed approach will quantify the dependencies between multiple flood hazards and identify and characterize compound events using a novel multivariate statistical approach. We will set up and calibrate a land surface model coupled to a hydrodynamic model to characterize the multivariate behaviour of flooding. The resulting framework will assess the impacts of compound flooding and characterize the contribution of each driver to the impacted areas under future scenarios, considering the effects of more intense hydroclimatic events and sea-level rise in a changing climate. In collaboration with the Institute for Catastrophic Loss Reduction (ICLR), we will disseminate the results of the project directly to the insurance industry involved in the management of urban flood risks. This project will provide river forecast centres, conservation authorities, insurers, municipalities, and other stakeholders with data, a rigorous methodology, and trained personnel to analyze and predict compound flooding, and it will significantly contribute to improved resilience of Canadian cities and communities.
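The project's multivariate method is not detailed here; as a hedged illustration of why driver dependence matters, the sketch below uses a simple Gaussian copula with assumed marginals and correlation to show how the joint exceedance probability of two drivers (rainfall and storm surge) is understated when independence is assumed:

    # Illustrative sketch only; correlation, marginals, and thresholds are assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    rho = 0.6                                 # assumed dependence between drivers
    z = rng.multivariate_normal([0, 0], [[1.0, rho], [rho, 1.0]], size=200_000)
    u = stats.norm.cdf(z)                     # Gaussian-copula samples on [0,1]^2

    # Map to assumed marginal distributions (for illustration only)
    rain = stats.gamma(a=2.0, scale=20.0).ppf(u[:, 0])       # daily rainfall (mm)
    surge = stats.gumbel_r(loc=0.5, scale=0.3).ppf(u[:, 1])  # storm surge (m)

    r95, s95 = np.quantile(rain, 0.95), np.quantile(surge, 0.95)
    p_joint = np.mean((rain > r95) & (surge > s95))
    print("joint exceedance (dependent):  ", round(p_joint, 4))
    print("joint exceedance (independent):", round(0.05 * 0.05, 4))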
Industry Partner(s): Institute for Catastrophic Loss Reduction
Academic Institution: Western University
Academic Researcher: Najafi, Mohammad Reza
Platform: Parallel CPU
Focus Areas: Clean Tech, Environment & Climate


Assessment of Prototype Scour Data
Scour is the removal of riverbed sediment, which can be induced by channel contraction, hydraulic structures, natural fluctuations in discharge, or changes in sediment supply in a fluvial environment. Scour and erosion have been identified as the primary cause of the majority of bridge failures in North America, and several investigators have concluded that about 50% of all bridge collapses occur due to scour-related complications. The prevalence of scour-induced bridge collapses underscores how critical scour estimation is for public safety and for limiting infrastructure costs, as this type of collapse often requires significant investment in the design and construction of replacement bridges, fault analysis, and potential rehabilitation. The available bridge design codes are mostly derived from estimates of scour at the laboratory scale (experiments in reduced-scale physical models) acquired under highly controlled conditions. Sources of model inaccuracy include scale effects in physical hydraulic modelling, a limited understanding of the flow physics of the phenomenon, and the limitations of current computational methods used to model sediment transport. One way to improve the predictive ability of current scour methodologies, and the aim of this research, is to use observed scour values in real rivers to verify and correct such methods. The present project will study prototype scour data at various field sites and use computational tools to assess the efficacy of scour estimation methods and investigate possible improvements to existing approaches.
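As a sketch of the kind of comparison envisioned, the example below evaluates one widely cited laboratory-derived predictor, a common form of the CSU/HEC-18 pier scour equation (quoted from memory as an illustration), against a made-up field observation; the flow conditions and correction factors are hypothetical:

    # Hedged sketch: a laboratory-derived predictor vs. a hypothetical observation.
    import math

    def hec18_pier_scour(y1, v1, a, K1=1.0, K2=1.0, K3=1.1, g=9.81):
        """Approximate local pier scour depth ys (m).

        y1: approach flow depth (m), v1: approach velocity (m/s), a: pier width (m).
        K1-K3: correction factors for pier shape, flow angle, and bed condition.
        """
        fr1 = v1 / math.sqrt(g * y1)                 # approach Froude number
        return 2.0 * y1 * K1 * K2 * K3 * (a / y1) ** 0.65 * fr1 ** 0.43

    # Hypothetical prototype conditions and a made-up observed scour depth
    predicted = hec18_pier_scour(y1=3.5, v1=1.8, a=1.2)
    observed = 2.1                                    # m
    print(f"predicted {predicted:.2f} m vs observed {observed:.2f} m")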
Industry Partner(s): Northwest Hydraulics Consultants
Academic Institution: University of Windsor
Academic Researcher: Balachandar, Ram
Platform: Parallel CPU
Focus Areas: Clean Tech, Environment & Climate





Automated Assessment of Forest Fire Hazards
The proposed project is to develop a Deep Reinforcement Learning (DRL) model capable of using Light Detection and Ranging (LiDAR) data to automatically assess whether vegetation around electric equipment poses a forest fire risk. The model automatically classifies points in a LiDAR scan to determine whether there is direct contact between vegetation and electric equipment and outputs control actions for an Unmanned Aerial Vehicle (UAV) that can autonomously inspect infrastructure as needed. Our research focuses on two different but complementary machine learning problems that the DRL model must solve for a UAV to autonomously acquire data of sufficient quality for fire risk assessment. The first is perception: analyzing raw LiDAR data to classify individual objects and deduce their shape and orientation. The second is control: using the perceived state of the environment, as determined by the first stage, to navigate until a complete scan of the region being assessed has been obtained. Solving these two problems together, in a way that yields an efficient, accurate, and complete scan of the environment, is an open research problem in DRL that we aim to address.
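A minimal, hypothetical sketch of the two coupled components, a per-point LiDAR classifier and a policy head producing UAV control actions, is given below; the architectures, dimensions, and class set are illustrative assumptions, not the project's actual design:

    # Hedged PyTorch sketch of the perception and control components.
    import torch
    import torch.nn as nn

    class PointClassifier(nn.Module):
        """Per-point classifier over (x, y, z, intensity) LiDAR features."""
        def __init__(self, n_classes=3):          # e.g., vegetation / equipment / other
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(4, 64), nn.ReLU(),
                nn.Linear(64, 64), nn.ReLU(),
                nn.Linear(64, n_classes),
            )
        def forward(self, points):                 # points: (batch, n_points, 4)
            return self.net(points)                # logits: (batch, n_points, n_classes)

    class UAVPolicy(nn.Module):
        """Maps a pooled scene descriptor to continuous velocity commands."""
        def __init__(self, state_dim=64, action_dim=4):    # vx, vy, vz, yaw rate
            super().__init__()
            self.net = nn.Sequential(nn.Linear(state_dim, 128), nn.Tanh(),
                                     nn.Linear(128, action_dim), nn.Tanh())
        def forward(self, state):
            return self.net(state)

    points = torch.randn(1, 2048, 4)               # one synthetic LiDAR scan
    logits = PointClassifier()(points)
    state = torch.randn(1, 64)                     # pooled perception features
    action = UAVPolicy()(state)
    print(logits.shape, action.shape)              # (1, 2048, 3) (1, 4)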
Industry Partner(s): Metsco Holding Inc.
Academic Institution: University of Guelph
Academic Researcher: Lei Lei
Platform: Cloud, GPU, Parallel CPU
Focus Areas: Business Analytics, Clean Tech, Energy, Environment & Climate, Mining


Automatic Identification of Phytoplankton using Deep Learning
Under the impact of global climate change and human activities, harmful algal blooms (HABs) have become a growing concern due to their negative impacts on water-related industries, such as safe water supply for fish farmers in aquaculture. The current method of algae identification requires water samples to be sent to a human expert, who identifies and counts all the organisms. This process typically takes about a week: the water sample must be preserved and shipped to a lab, where an expert manually examines it under a microscope, which is both time-consuming and prone to human error. Reliable and cost-effective methods of quantifying the type and concentration of algae cells have therefore become critical for successful water management. Blue Lion Labs, in partnership with the University of Waterloo, is building an innovative system to automatically classify multiple types of algae in situ and in real time using a custom imaging system and deep learning. This will be accomplished in two main steps. First, the technical team will work with an in-house algae expert (a phycologist) to build a labelled database of images. Second, the research team will design and build a novel neural network architecture to segment and classify the images generated by the imaging system. The proposed research will dramatically reduce the analysis time for a water sample from weeks to hours and will therefore enable fish farmers to better manage harmful algal blooms.
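As a hedged sketch only (not Blue Lion Labs' actual architecture), the example below shows a small convolutional classifier over image patches; the taxa list and input size are placeholders:

    # Minimal sketch of patch classification for algae taxa; all names are placeholders.
    import torch
    import torch.nn as nn

    CLASSES = ["Microcystis", "Anabaena", "diatom", "other"]   # placeholder taxa

    class AlgaePatchClassifier(nn.Module):
        def __init__(self, n_classes=len(CLASSES)):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x):                      # x: (batch, 3, H, W)
            return self.classifier(self.features(x).flatten(1))

    model = AlgaePatchClassifier()
    patch = torch.randn(8, 3, 128, 128)            # a batch of synthetic image patches
    print(model(patch).shape)                      # (8, 4) class logits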
Industry Partner(s): Blue Lion Labs
Academic Institution: University of Waterloo
Academic Researcher: Wong, Alexander
Focus Areas: AI, Environment & Climate





Characterization and Optimization of Heat Transfer from GRIP Metal Enhanced Surfaces
NUCAP specializes in brake pads and, while improving their manufacturing process, developed an innovative way to enhance metal surfaces with small raised metal hooks (GRIP Metal). The raised features offer both increased surface area and increased turbulence and flow mixing, which can greatly enhance convective heat transfer. The main objective of this project is to develop models that accurately predict convective heat transfer and the associated pressure drop for GRIP Metal enhanced surfaces. These models will ultimately be used to optimize GRIP Metal surfaces for a wide range of heat exchange applications.
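As a hedged baseline sketch (not NUCAP's or the project's model), the example below uses the standard Dittus-Boelter smooth-channel correlation plus purely hypothetical enhancement and pressure-drop factors to illustrate the kind of trade-off such models must capture:

    # Smooth-channel baseline vs. assumed enhancement factors; all numbers are illustrative.
    def dittus_boelter_h(Re, Pr, k_fluid, D_h):
        """Smooth-duct convective coefficient h (W/m^2.K) for turbulent heating."""
        Nu = 0.023 * Re ** 0.8 * Pr ** 0.4
        return Nu * k_fluid / D_h

    # Air-like properties in a 10 mm hydraulic-diameter channel (assumed values)
    h_smooth = dittus_boelter_h(Re=10_000, Pr=0.7, k_fluid=0.026, D_h=0.01)

    area_ratio = 1.8        # hypothetical surface-area gain from the raised hooks
    h_ratio = 1.4           # hypothetical gain in h from added mixing/turbulence
    dp_ratio = 2.5          # hypothetical pressure-drop penalty

    q_gain = area_ratio * h_ratio                   # relative heat duty at fixed dT
    print(f"baseline h ~ {h_smooth:.0f} W/m2K, relative heat duty x{q_gain:.1f}, "
          f"pressure drop x{dp_ratio:.1f}")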
Industry Partner(s): NUCAP Industries
Academic Institution: York University
Academic Researcher: Roger Kempers
Platform: Parallel CPU
Focus Areas: Advanced Manufacturing, Clean Tech, Energy, Environment & Climate, Mining

Cold-region Land-surface Simulations under Climate Change
This proposal is part of a larger modelling project, Canada 1 Water, for which separate RAC applications have been submitted. This request directly supports a Mitacs application and concerns a subset of simulations that will be performed by, or will directly involve, the Mitacs interns. We anticipate two 150-year climate projection runs using CRCM5 forcing data from the CORDEX archive and two 50-year historical reanalysis simulations (using ERA5 and NRCan climate forcing), as well as 100 model-years for development, testing, and spin-up, totalling 500 model-years. The model used for this project is the Community Terrestrial Systems Model (CTSM), the stand-alone version of CLM5, the land component of the Community Earth System Model v2. One model-year of CTSM/CLM5 simulation for Canada at 4 km resolution requires approximately 0.2 core-years and 114 GB of storage. This estimate is based on lower-resolution test runs; scaling is trivial since vertical columns do not communicate.
The anticipated 500 model-years would require 100 core-years of compute time and produce 57 TB of output; however, much of the output could be stored on a separate tape archive or off-site. Fifty years of ERA5 climate forcing data would require approximately 20 TB of continuous scratch or project storage; NRCan data and topographic data would require 10 TB; and each 150-year CRCM5 projection would require approximately 16 TB, for a total of 64 TB of forcing data. However, only about 50% of the forcing data would be needed at any given time, so 100 TB of scratch storage would be sufficient. Note that these are estimates; the resolutions and the number of simulations can be adjusted as resources permit. Furthermore, the intention is to make all data publicly available for research.
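A back-of-envelope check of the figures above (0.2 core-years and 114 GB of output per model-year, plus the forcing-data volumes) reproduces the 100 core-year and 57 TB totals; the last line is one plausible reading of why 100 TB of scratch would suffice:

    # Arithmetic check of the resource estimates quoted above.
    model_years = 500
    core_years = model_years * 0.2                 # 100 core-years of compute
    output_tb = model_years * 114 / 1000           # 57 TB of model output

    forcing_tb = 20 + 10 + 2 * 16                  # ERA5 + NRCan/topo + two CRCM5 runs = 64 TB
    resident_tb = 0.5 * forcing_tb + output_tb     # ~50% of forcing on disk at once
    print(core_years, output_tb, forcing_tb, round(resident_tb))   # 100.0 57.0 64 89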
Industry Partner(s): Aquanty Inc
Academic Institution: University of Waterloo
Academic Researcher: Fletcher, Christopher
Platform: Parallel CPU
Focus Areas: Environment & Climate




Hybrid Quantum-Classical Simulation and Optimization Platform for Industry
Predictable power is valuable power. The intermittent nature of renewables such as wind energy poses several challenges for planning the daily operation of the electric grid. Because their power output fluctuates over multiple time scales, the grid operator has to adjust its day-ahead, hour-ahead, and real-time operating procedures. In the absence of cheap and efficient energy storage to make up for sudden generation shortfalls or excesses caused by renewables, the grid operator currently sends a signal to power plants approximately every four seconds to keep total electric supply and demand in balance. To strengthen the business case for wind power and further drive the adoption of carbon-free energy on electric grids worldwide, novel strategies are required to stabilize the grid as the number of renewable generators connected to it increases.
Our main goal in this collaboration is to design novel models and algorithms that improve hour-ahead and day-ahead renewable energy prediction. As a first step, we are creating a novel weather simulation model using knowledge-guided or physics-informed machine learning (PIML) methods and machine-learning-accelerated computational fluid dynamics (CFD) techniques. In parallel, we will explore the suitability of current and near-term quantum computers as co-processors to help meet the high computational demands of simulating complex systems such as wind.
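As a hedged sketch of the physics-informed idea only (not Foreqast's actual model), the example below trains a small network against observations plus the residual of a 1-D advection equation, u_t + c u_x = 0, standing in for the much richer atmospheric physics:

    # Minimal physics-informed training loop; equation, data, and network are illustrative.
    import torch
    import torch.nn as nn

    c = 1.0                                              # assumed advection speed
    net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                        nn.Linear(64, 64), nn.Tanh(),
                        nn.Linear(64, 1))

    def pde_residual(xt):
        xt = xt.clone().requires_grad_(True)
        u = net(xt)
        grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
        u_x, u_t = grads[:, 0:1], grads[:, 1:2]
        return u_t + c * u_x                             # residual of u_t + c u_x = 0

    # Synthetic "observations" from the exact solution, plus random collocation points
    xt_obs = torch.rand(64, 2)
    u_obs = torch.sin(2 * torch.pi * (xt_obs[:, :1] - c * xt_obs[:, 1:2]))
    xt_col = torch.rand(256, 2)

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        loss = (nn.functional.mse_loss(net(xt_obs), u_obs)
                + pde_residual(xt_col).pow(2).mean())    # data term + physics penalty
        loss.backward()
        opt.step()
    print(float(loss))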
Industry Partner(s): Foreqast
Academic Institution: University of Waterloo
Academic Researcher: Kempf, Achim
Platform: Cloud, GPU, Parallel CPU
Focus Areas: AI, Clean Tech, Energy, Environment & Climate

Real-time flood forecasts using parallel cloud computing and intelligent algorithms
Development of flood forecast early warning systems is critical as storm and spring melt events intensify due to climate change. Predicting these events with precision and adequate warning time presents a significant technological and computational challenge for managers of water resource systems.
Effective flood forecasting and warning systems combine several complex modelling tools. Climate forecast, hydrologic and hydraulic models are used together to forecast flows, water levels and flood inundation extents. Such models are computationally expensive, and each has associated uncertainties that must be quantified.
This project addresses these issues by coupling the models with Machine Learning algorithms and high-performance computing infrastructure. The approach will enable the existing flood forecasting platform (known as ISWMS) to better i) quantify uncertainties associated with flood forecasts and ii) minimize the run-time of the computationally expensive models embedded within the forecasting system.
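One common pattern for this kind of coupling, shown below as a hedged sketch rather than the ISWMS internals, is to train a fast surrogate on a limited number of expensive model runs and then push a large forecast ensemble through the surrogate to obtain uncertainty bounds; all functions and numbers here are synthetic stand-ins:

    # Surrogate-based uncertainty propagation sketch; the "expensive model" is synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    def expensive_hydraulic_model(rain_mm, soil_wetness):
        """Stand-in for a slow hydrodynamic simulation (peak level in m)."""
        return 0.02 * rain_mm * (0.5 + soil_wetness) + 0.1 * np.sqrt(rain_mm)

    # A modest training set of "full model" runs
    X_train = np.column_stack([rng.uniform(0, 120, 200), rng.uniform(0, 1, 200)])
    y_train = expensive_hydraulic_model(X_train[:, 0], X_train[:, 1])
    surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

    # Propagate a 5000-member forecast ensemble through the cheap surrogate
    ensemble = np.column_stack([rng.normal(60, 20, 5000).clip(0),
                                rng.uniform(0.3, 0.9, 5000)])
    levels = surrogate.predict(ensemble)
    print("90% predictive interval:", np.quantile(levels, [0.05, 0.95]).round(2))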
Industry Partner(s): Greenland International Consulting Inc.
Academic Institution: University of Guelph
Academic Researcher: Prasad Daggupati
Platform: Cloud
Focus Areas: Environment & Climate




Sparse Representations for Embodied AI
Understanding scenes that represent real-world environments is a challenging problem at the intersection of computer vision and deep learning and a necessary prerequisite for Embodied AI. Embodied AI is an emerging field within machine learning that focuses on the challenges that must be addressed to successfully deploy edge devices such as drones and robots. In this setting, estimating the semantics of an environment plays an essential role, as does navigating the environment efficiently to solve a variety of tasks that may involve other agents. The Embodiment Hypothesis states that intelligence emerges from the interaction of an agent with its perception of the environment it is embodied within. This project establishes an empirical study to validate this hypothesis for Deep Reinforcement Learning (DRL) agents trained on environments derived from simulations as well as real-world data. We use DRL to model a Partially Observable Markov Decision Process (POMDP) that describes optimal decision-making for navigation-perception problems. The Embodiment Hypothesis implies that an end-to-end treatment of this problem should outperform conventional methods for training computer vision models. To address the limitations of existing DRL algorithms on this task, we propose a novel family of networks that learn to solve embodied tasks from sparse representations of the perceived data. Our aim is to enable a new paradigm of efficient production systems for drones that navigate complex environments and visually monitor infrastructure, for example for autonomous fire risk assessment and asset monitoring.
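The proposed sparse networks are not specified here; as an illustrative sketch under that caveat, the example below keeps only the top-k activations of an encoder before a small policy head for a POMDP-style agent, with all dimensions chosen arbitrarily:

    # Top-k sparse encoder feeding a policy head; an assumption-laden illustration only.
    import torch
    import torch.nn as nn

    class TopKSparse(nn.Module):
        """Zero out all but the k largest activations per sample."""
        def __init__(self, k):
            super().__init__()
            self.k = k
        def forward(self, x):
            topk = torch.topk(x, self.k, dim=-1)
            mask = torch.zeros_like(x).scatter_(-1, topk.indices, 1.0)
            return x * mask

    class SparsePolicy(nn.Module):
        def __init__(self, obs_dim=128, hidden=256, k=16, n_actions=4):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU(), TopKSparse(k))
            self.policy = nn.Linear(hidden, n_actions)
        def forward(self, obs):
            return self.policy(self.encoder(obs))     # action logits

    obs = torch.randn(2, 128)                         # partial observations of the scene
    print(SparsePolicy()(obs).shape)                  # (2, 4)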
Industry Partner(s): EthicalAi Inc
Academic Institution: University of Guelph
Academic Researcher: Lei, Lei
Focus Areas: 5G/NextGen Networks, Cities, Energy, Environment & Climate, Transportation




The Alliance for AI-Accelerated Materials Discovery (A3MD)
The Alliance for AI-Accelerated Materials Discovery (A3MD) is an industrial-academic initiative that leverages advances in AI and high-throughput experimentation to accelerate the discovery of new materials for clean energy and electronics. We are using high-performance computing to perform new first-principles calculations of materials properties that, combined with existing databases and our own experimentally generated data, can be used to train machine learning models capable of rapidly exploring a vast chemical space for desired properties. We use robotics to rapidly synthesize and evaluate large numbers of materials in parallel and use AI to learn from these experimental data to inform the next set of experiments. Our goal is to accelerate the rate at which we discover new high-performance materials by at least 100x.
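As a hedged sketch of the closed-loop idea (not A3MD's actual workflow), the example below uses a Gaussian-process surrogate with an upper-confidence-bound rule to choose the next candidate to evaluate; the "measure" function is a synthetic stand-in for a first-principles calculation or robotic experiment:

    # Active-learning loop sketch; candidate space and property function are synthetic.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(0)
    candidates = rng.uniform(0, 1, size=(500, 3))         # e.g., composition fractions

    def measure(x):                                        # placeholder for DFT/experiment
        return -np.sum((x - 0.3) ** 2, axis=-1) + 0.01 * rng.normal(size=len(x))

    X = candidates[rng.choice(len(candidates), 10, replace=False)]
    y = measure(X)
    for _ in range(20):                                    # active-learning iterations
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        mu, sigma = gp.predict(candidates, return_std=True)
        nxt = candidates[np.argmax(mu + 2.0 * sigma)]      # UCB acquisition
        X = np.vstack([X, nxt])
        y = np.append(y, measure(nxt[None, :]))
    print("best property value found:", y.max().round(3))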
Industry Partner(s): LG
Academic Institution: University of Toronto
Academic Researcher: Ted Sargent
Platform: GPU, Parallel CPU
Focus Areas: Advanced Manufacturing, Energy, Environment & Climate, Mining