Phenomena occur at different rates. In data-driven optimization, real-life events are sampled to generate the information that informs decision-making, and these spatiotemporally disparate events can interact with one another. Models are said to be multiscale when they include constraints that bind decisions and the commodity streams those decisions generate, remain consistent (balanced) across space, time, and optionality, and account for interactions between decisions taken at different dispositions. Modeled systems are either simulated or optimized. Modeling focuses on simulating a system, and its constituent operations, as a function of time and space at whatever fidelity the available information allows. The idea is to mimic real systems 'mathematically': the closer we are to reality, the greater the validity and relevance of our solution, and the lower the risk of being wrong. High-fidelity modeling relies on dense parameter sets and nonlinear functions.
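To make the simulation side concrete, here is a minimal, purely illustrative sketch: a battery's state of charge stepped through a day under a made-up charge/discharge schedule. Every name and number here is hypothetical; real models carry far denser parameter sets and nonlinearities.

```python
# Toy discrete-time simulation of a storage system over 24 hourly steps.
# All parameters are hypothetical and chosen only to illustrate what
# "mimicking a system mathematically" looks like at very low fidelity.

capacity = 100.0    # storage capacity (say, MWh)
efficiency = 0.95   # losses folded into charging
soc = 50.0          # initial state of charge

# positive = charging, negative = discharging (MW over one-hour steps)
schedule = [10, 10, -5, -20, 0, 15] * 4  # 24 hourly decisions

trajectory = []
for power in schedule:
    if power >= 0:
        soc = min(capacity, soc + efficiency * power)
    else:
        soc = max(0.0, soc + power)
    trajectory.append(soc)

print(trajectory)
```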
Optimization, on the other hand, is the act of determining the best solution to a problem without exhaustive enumeration. There are algorithms for this, and a suite of such algorithms forms a solver. Solvers are typically experts at a general class of problems. For example, a mixed-integer program involves binary and/or integer variables alongside continuous ones, and thus needs algorithms that can branch through the decision tree and provide solutions efficiently. Needless to say, computational tractability for large and complex models is a challenge. Improvements in processing power alone have let modelers tackle larger problems, but there is always the wall of computing cost. This is often overcome by writing more efficient algorithms or by surrogate modeling, which means solving a smaller or lower-order problem that is indicative of, or scales to, the larger one. Surrogate modeling can involve clustering (contracting time-series data into a smaller set of representative days), linearization (e.g. piecewise linearization), or building reduced-order black-box models using neural networks and machine learning.
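As a deliberately tiny illustration of the representative-days idea, the sketch below clusters a year of synthetic hourly demand into four representative days with k-means. The data and the choice of scikit-learn are my own assumptions for the example, not part of any particular solver or framework.

```python
# Contracting a year of hourly data into a few representative days
# via k-means clustering. The demand data here is synthetic; in practice
# you would cluster measured load, price, or resource-availability profiles.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# 365 days x 24 hours of synthetic demand: a daily shape plus noise
hours = np.arange(24)
daily_shape = 50 + 20 * np.sin((hours - 6) * np.pi / 12)
demand = daily_shape + rng.normal(0, 5, size=(365, 24))

# cluster the days and keep the centroids as "representative days"
k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(demand)
representative_days = km.cluster_centers_          # shape (4, 24)
weights = np.bincount(km.labels_, minlength=k)     # days each centroid represents

print(representative_days.shape, weights)
```

A downstream planning model can then run over four weighted representative days instead of all 8,760 hours, which is often the difference between a tractable and an intractable problem.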
Now, since all of this is data-driven, any error in estimating the samples is carried into the solution, and this is something we want to guard against. A simple way to hedge against (be robust to) the risk this uncertainty invites is to take a more conservative solution. Conservative solutions have trade-offs; essentially, conservativeness often comes at a cost. Quantifying that price tag is analogous to ascertaining how sensitive the solution is to the parameter uncertainty. This can be done by comparing discrete scenarios, or by treating the parameters as variables, which yields the solution as an explicit function of the uncertainty in the sample. The latter is called multiparametric programming; while computationally more expensive, it provides the entire solution in one go. There can also be trade-offs between different objective criteria. Quality, measured say as how long a product lasts, may come at a cost. Quantifying these trade-offs lets decisions be taken with a more potent understanding of the system. The field that deals with all of this is called process systems engineering (PSE).
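Here is a minimal sketch of the scenario-comparison route, with made-up numbers: a least-cost procurement LP solved once per demand scenario and once for the worst case, so the extra cost of the conservative plan can be read off directly.

```python
# Comparing discrete demand scenarios: procure from a cheap-but-limited
# source (x1) and an expensive backup (x2) to meet uncertain demand.
# All numbers are hypothetical, purely for illustration.

from scipy.optimize import linprog

cost = [1.0, 3.0]                 # unit costs of the two sources
scenarios = [80.0, 100.0, 130.0]  # possible demand realizations

def plan(demand):
    """Least-cost procurement meeting a given demand."""
    res = linprog(
        c=cost,
        A_ub=[[-1.0, -1.0]],      # x1 + x2 >= demand
        b_ub=[-demand],
        bounds=[(0, 60), (0, None)],
    )
    return res.x, res.fun

for d in scenarios:
    _, best_cost = plan(d)
    print(f"demand {d:>5}: scenario-optimal cost = {best_cost:.1f}")

# The conservative (robust) plan covers the worst-case demand no matter
# what happens; its extra spend at nominal demand is the price of conservativeness.
_, robust_cost = plan(max(scenarios))
_, nominal_cost = plan(scenarios[1])
print(f"price of conservativeness at nominal demand: {robust_cost - nominal_cost:.1f}")
```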
So, where do I come in? I learn about, develop, and apply parts of all of the above to energy systems, and I try to package my work as open-source resources that help decision-makers make informed decisions. Through all of this, I hope to help make systems cleaner, more reliable, more affordable, and, most importantly, more accessible. In PSE terminology, 'energy systems' has grown to include... well... pretty much anything and everything, though different domains arguably pose bespoke challenges. So far, I have looked at things such as the production of dense chemical energy carriers (hydrogen, for example), power generation, manufacturing supply chains, and material supply chains. Note that we do not look at these in isolation; subsystems are studied holistically within large-scale networks, which lets us model 'dependence', i.e. how they influence each other. My work helps design future energy systems, perform lifecycle analysis and impact accounting, generate robust operational schedules, understand the role of, and challenges posed by, materials in enabling a technology transition, and design systems to be more resilient to disruptions, amongst other very cool things.
The computational framework my PhD led to is called Energia, which does some of what has been discussed here. I also developed Gana, which is, put simply, a 'writer' for formulating mathematical programs (scripts). If you still have questions about what I do, reach out!