In the first two weeks of my internship, I’ve been getting familiar with my team’s hyperparameter optimization for groundwater level predictions and thinking about how I can build a physics-guided neural network to improve those predictions. I’ve been reading scientific literature on physics-guided neural networks, taking notes on my findings, and incorporating them into a PowerPoint presentation I’ll give at our team’s biweekly meeting next Tuesday. So far, I’ve decided that I want to use the soil moisture predictions of the process-based model HYDRUS as a feature in our groundwater level prediction neural network. Additionally, I would like to emulate HYDRUS by developing a neural network that predicts soil moisture, is less computationally expensive, and hopefully makes better predictions. To do this, I’ll need to incorporate a physics constraint into the neural network’s loss function that penalizes predictions that violate physical laws.
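To make the loss-function idea concrete, here is a minimal sketch of what a physics-constrained loss could look like. Everything here (the function name, the penalty weight `lam`) is a hypothetical placeholder for illustration, not my team’s actual model:

```python
import numpy as np

def physics_guided_loss(y_pred, y_true, physics_penalty, lam=0.5):
    """Combine a standard data-fit term with a physics-violation penalty.

    y_pred, y_true : arrays of predictions and observations
    physics_penalty : nonnegative scalar measuring how badly the
        predictions violate a chosen physical law
    lam : hypothetical weight balancing data fit against physics
    """
    mse = np.mean((y_pred - y_true) ** 2)  # ordinary data-fit loss
    return mse + lam * physics_penalty     # extra cost for unphysical output
```

The appeal of this form is that the physics term is computed from the predictions themselves, so it can be evaluated even on unlabeled inputs where no observations exist.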
The papers I’ve read so far have largely been on predicting lake temperatures and integrate predictions from the GLM (General Lake Model), a process-based model that predicts lake temperatures and enforces energy conservation. I would like to do something similar: predict soil moisture with HYDRUS and use it as an input feature to our groundwater level neural network to potentially increase the accuracy of our current predictions. The physics constraints in the lake models enforce the depth-density relationship, i.e., densities, which can be calculated from the predicted temperatures, must not decrease with depth in the lake. In all of the studies, the physics-guided neural network outperformed both the process-based model and a generic neural network. I haven’t determined which physical constraint I will implement yet, but we discussed conservation of energy as a possibility at our last meeting. Another idea I encountered in my readings was pretraining the neural network on output from the process-based model, which was shown to improve predictions when smaller amounts of training data were available. I could possibly incorporate this by pretraining my soil moisture neural network on HYDRUS simulations, using its final weights as the initialization before fine-tuning on observations, and then feeding its output predictions into our groundwater level neural network as a feature.
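As a rough sketch of how the lake papers’ depth-density constraint could be turned into a penalty term (the hinge-squared form below is my own illustration, not necessarily the papers’ exact formulation):

```python
import numpy as np

def density_monotonicity_penalty(densities):
    """Penalty for violating the depth-density constraint.

    densities : array of densities ordered from the surface downward.
    Physically, density should never decrease with depth, so any drop
    between consecutive depths contributes to the penalty.
    """
    drops = densities[:-1] - densities[1:]  # positive where density decreases going down
    # Squared hinge: zero iff the profile is monotonically nondecreasing
    return float(np.sum(np.maximum(drops, 0.0) ** 2))
```

A penalty like this would plug directly into the physics term of the loss, steering the network toward physically plausible temperature (and hence density) profiles.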
Some questions I’d like to explore: can we use pretraining to increase the accuracy of our predictions when we have limited data, and does a physics-guided neural network for soil moisture outperform other models, specifically when the training and test data are dissimilar or when we train on non-site-specific data? I’m going to meet with some of the experts on our team this week to get familiar with HYDRUS and run my current ideas by them. I think one of the main challenges of my internship will be implementing scientific constraints in the machine learning model without much background on the topic. I’m excited to start implementing my ideas in code and add to my team’s current model, but I also anticipate that this might take a while to formulate. I need to meet more with our team members, get a good understanding of how HYDRUS works, and develop a concrete plan for incorporating physical constraints in our deep learning model.
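The pretraining question could be prototyped with a tiny two-stage experiment before touching the real model. The toy below uses a one-parameter linear model in place of a neural network, and the “simulated” and “observed” data are stand-ins I made up to illustrate warm-starting from simulation-trained weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear(x, y, w_init=0.0, lr=0.1, steps=200):
    """Fit y ~ w * x by gradient descent on MSE, starting from w_init."""
    w = w_init
    for _ in range(steps):
        grad = np.mean(2 * (w * x - y) * x)  # d(MSE)/dw
        w -= lr * grad
    return w

# Stage 1: "pretrain" on plentiful process-model (HYDRUS-like) simulations.
x_sim = rng.uniform(0, 1, 1000)
y_sim = 2.0 * x_sim                  # stand-in for simulated soil moisture
w_pre = fit_linear(x_sim, y_sim)

# Stage 2: fine-tune on a handful of field observations, warm-starting
# from the pretrained weight instead of from scratch.
x_obs = rng.uniform(0, 1, 5)
y_obs = 2.2 * x_obs                  # stand-in for sparse real observations
w_final = fit_linear(x_obs, y_obs, w_init=w_pre, steps=50)
```

The point of the sketch is the workflow: the simulation stage lands the parameters near a physically reasonable solution, so the scarce-data stage only has to make a small correction rather than learn from zero.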
On a less technical note, I mixed a margarita for our weekly virtual happy hour and got to meet everyone on the team. It was a fun introduction to the people at LBNL (Lawrence Berkeley National Lab) I’m working with, and one of the team members showed us his drawer full of golf balls he’d collected from the people who continually drive through his backyard on golf carts. While I was sad to find out that my internship would be fully remote due to COVID-19 safety precautions, I’m glad that we have the resources to continue. I’m in my Berkeley apartment for the rest of June and took a walk on campus last week to the Glade. It was nice to feel a connection to campus again, but there was definitely a feeling of surrealness seeing the campus so empty. I hope to be able to visit LBNL after quarantine to see where the research I’m working on is usually done and meet my team for some celebratory margaritas.