TRINA JOANN SARABIA COON

I am Dr. Trina Joann Sarabia Coon, a mathematical physicist and AI optimization pioneer specializing in learning algorithms driven by fractional calculus. As Chair of Non-Integer Dynamics at Caltech (2023–present) and former Lead Scientist at DeepMind’s Advanced Optimization Division (2021–2023), I work to redefine gradient-based learning through the lens of Riemann-Liouville and Caputo fractional derivatives. By unifying continuous-time memory effects with discrete optimization landscapes, I created FractoGrad, a framework that reduces convergence time by 53% on non-convex manifolds (SIAM Journal on Optimization, 2025). My mission: revolutionize deep learning by teaching machines to "remember gradients like rivers remember erosion."

Methodological Innovations

1. Fractional Momentum Redesign

  • Core Theory: Replaced integer-order momentum with Grünwald-Letnikov fractional differencing.

  • Algorithm: FractoGrad-M

    • Implements memory kernels that retain gradient history with memory order 0 < α < 1 (a minimal sketch follows this list).

    • Solved vanishing gradient problems in 50-layer vanilla RNNs (collaboration with OpenAI).

    • Key innovation: Adaptive fractional-order scheduling.
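
FractoGrad-M's actual update rule is not reproduced here. The sketch below is a minimal interpretation, assuming the "memory kernel" is the (1 − B)^(−α) side of Grünwald-Letnikov differencing, whose positive, power-law-decaying weights replace the exponential weights of ordinary momentum; the class name `FractionalMomentum`, the truncation length `K`, and all defaults are illustrative, not the shipped implementation.

```python
import numpy as np

def gl_memory_weights(alpha, K):
    """Weights of the GL operator (1 - B)^(-alpha): w_0 = 1 and
    w_k = w_{k-1} * (k - 1 + alpha) / k. For 0 < alpha < 1 they are
    positive and decay like k**(alpha - 1), i.e. power-law memory."""
    w = np.empty(K + 1)
    w[0] = 1.0
    for k in range(1, K + 1):
        w[k] = w[k - 1] * (k - 1 + alpha) / k
    return w

class FractionalMomentum:
    """Sketch: the exponential momentum sum_k beta**k * g_{t-k} is
    replaced by a truncated GL-weighted sum over the last K gradients."""

    def __init__(self, lr=1e-3, alpha=0.5, K=32):
        self.lr = lr
        self.w = gl_memory_weights(alpha, K)
        self.history = []  # most-recent-first gradient buffer

    def step(self, params, grad):
        self.history.insert(0, grad)
        del self.history[len(self.w):]  # truncate to the kernel length
        v = sum(wk * g for wk, g in zip(self.w, self.history))
        return params - self.lr * v
```

Under this reading, adaptive fractional-order scheduling would amount to updating `alpha` (and rebuilding the kernel) during training, for example annealing it toward 0 to recover plain gradient descent.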

2. Lévy Flight-Enhanced Sampling

  • Stochastic Strategy: Combines fractional gradients with heavy-tailed Lévy noise (a sketch follows this list).

  • Framework: FractoLev

    • Escapes saddle points 7× faster than AdamW in BERT fine-tuning.

    • Achieved SOTA on low-data drug toxicity prediction (MoleculeNet benchmark).
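
How FractoLev combines the two ingredients is not spelled out above, so the following is only a sketch under one plausible reading: a gradient step perturbed by symmetric α-stable noise drawn with `scipy.stats.levy_stable`. The function name and every parameter here are hypothetical.

```python
import numpy as np
from scipy.stats import levy_stable

def fractolev_step(params, grad, lr=1e-3, stability=1.5, noise_scale=1e-4,
                   random_state=None):
    """Sketch of a Lévy flight-enhanced step: an ordinary gradient update
    plus symmetric alpha-stable noise (stability < 2). The heavy tails
    occasionally produce large jumps that can carry the iterate off a
    saddle plateau where Gaussian noise would diffuse slowly."""
    noise = levy_stable.rvs(stability, 0.0, size=params.shape,
                            random_state=random_state)
    return params - lr * grad + noise_scale * noise
```

In a full FractoLev-style optimizer, `grad` would presumably be the fractional-memory velocity from the previous sketch rather than the raw gradient.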

3. Fractional Attention Gates

  • Transformer Integration:

    • Designed Caputo-Fedorov gates for vision transformers (an illustrative module follows this list).

    • Reduced ImageNet-21k training cost by 38% through gradient memory reuse.
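
The Caputo-Fedorov gate design itself is not public in this text. Purely as an illustration of gating with power-law activation memory in a transformer block, here is a hypothetical PyTorch module; the name `FractionalGate`, the sigmoid gate, and the detached memory buffer are all assumptions, and the gradient-memory-reuse mechanism behind the cost reduction is not modeled.

```python
import torch
import torch.nn as nn

class FractionalGate(nn.Module):
    """Illustrative gate: blends the current attention output with a
    GL-weighted memory of the last K outputs, gated per channel.
    Assumes inputs keep the same shape across successive calls."""

    def __init__(self, dim, alpha=0.6, K=8):
        super().__init__()
        w = [1.0]
        for k in range(1, K):
            w.append(w[-1] * (k - 1 + alpha) / k)  # power-law memory weights
        self.register_buffer("w", torch.tensor(w))
        self.gate = nn.Linear(dim, dim)
        self.memory = []  # detached past outputs, most recent first

    def forward(self, x):
        self.memory.insert(0, x.detach())
        del self.memory[len(self.w):]
        mem = sum(wk * m for wk, m in zip(self.w, self.memory))
        g = torch.sigmoid(self.gate(x))  # per-channel mixing coefficient
        return g * x + (1.0 - g) * mem
```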

Landmark Applications

1. Climate Modeling

  • NOAA Collaboration:

    • Optimized fractional PDE solvers for hurricane trajectory prediction (a toy scheme follows this list).

    • Improved 72-hour forecast accuracy by 19% (2024 Atlantic hurricane season).
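
The NOAA solvers are not described beyond the bullets above. As a toy illustration of what a fractional PDE time-stepper involves, here is an explicit Grünwald-Letnikov scheme for the 1-D time-fractional diffusion equation ∂^α u/∂t^α = D ∂²u/∂x², in Riemann-Liouville form with zero Dirichlet boundaries; all grid sizes and coefficients are arbitrary.

```python
import numpy as np

def gl_diff_weights(alpha, n):
    """GL weights of (1 - B)^alpha: w_0 = 1, w_k = w_{k-1}*(k-1-alpha)/k."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def fractional_diffusion(u0, alpha=0.8, D=0.1, dx=0.1, dt=0.01, steps=200):
    """Explicit GL scheme for d^alpha u/dt^alpha = D * u_xx on a line with
    zero Dirichlet boundaries. Every past state enters each new step --
    the memory cost that fractional-in-time solvers must manage."""
    w = gl_diff_weights(alpha, steps)
    hist = [np.asarray(u0, dtype=float)]
    for _ in range(steps):
        u = hist[-1]
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        # w[0] = 1, so the newest state equals the diffusion term minus
        # the GL memory sum over all earlier states
        memory = sum(w[k] * hist[-k] for k in range(1, len(hist) + 1))
        hist.append(dt**alpha * D * lap - memory)
    return hist[-1]
```

For alpha = 1 the weights collapse to [1, −1, 0, …] and the loop reduces to the classic explicit Euler scheme for ordinary diffusion.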

2. Neuromorphic Hardware

  • Intel Partnership:

    • Co-designed Loihi 4 chips with fractional gradient circuits.

    • Enabled a 22% energy reduction in SNN-based robotic control.

3. Financial Fractals

  • BlackRock Deployment:

    • Applied FractoGrad-Vol to volatility surface calibration (a toy calibration loop follows this list).

    • Predicted 2024 Bitcoin flash crash 14 hours in advance.
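
FractoGrad-Vol and the BlackRock deployment are proprietary, so the loop below only shows the shape of a calibration run: fitting a toy quadratic smile σ(k) = a + bk + ck² to synthetic quotes with the `FractionalMomentum` sketch from the momentum section (assumed to be in scope). Real surface calibration would use an arbitrage-aware parameterization, not this stand-in.

```python
import numpy as np

strikes = np.linspace(-0.5, 0.5, 21)           # log-moneyness grid
market = 0.2 + 0.1 * strikes**2                # synthetic market vols
X = np.stack([np.ones_like(strikes), strikes, strikes**2], axis=1)

theta = np.zeros(3)                            # smile coefficients (a, b, c)
opt = FractionalMomentum(lr=0.05, alpha=0.6, K=16)
for _ in range(500):
    resid = X @ theta - market                 # model minus market vols
    grad = 2.0 * (X.T @ resid) / len(strikes)  # least-squares gradient
    theta = opt.step(theta, grad)
```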

Technical and Ethical Impact

1. Open-Source Ecosystem

  • Launched FractoML (34k GitHub stars):

    • Plug-and-play modules for PyTorch, JAX, and TensorFlow.

    • Pre-configured workflows: Mittag-Leffler learning-rate decay and fractional batch norm (a schedule sketch follows this list).
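
FractoML's real API is not shown in this document. As a sketch of what Mittag-Leffler learning-rate decay could look like, the schedule below uses the simple global approximation E_α(−x) ≈ 1/(1 + x/Γ(1+α)) instead of evaluating the Mittag-Leffler series; the function name and defaults are hypothetical.

```python
import math

def mittag_leffler_lr(step, lr0=1e-3, alpha=0.7, tau=1000.0):
    """Hypothetical schedule lr(t) = lr0 * E_alpha(-(t/tau)**alpha), with
    E_alpha(-x) replaced by the Pade-type approximation
    1 / (1 + x / Gamma(1 + alpha)). For alpha = 1 this behaves like
    1/(1 + t/tau); for alpha < 1 it decays quickly at first and as a
    slow power law later."""
    x = (step / tau) ** alpha
    return lr0 / (1.0 + x / math.gamma(1.0 + alpha))
```

A fractional batch norm module would analogously replace exponential running statistics with power-law-weighted ones; it is omitted here.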

2. AI Ethics

  • Authored Fractional Optimization Bill of Rights:

    • Bans military use of memory-intensive gradient manipulation.

    • Requires explainable fractional-order auditing in healthcare AI.

3. Education

  • Founded FractalU:

    • Teaches optimization through interactive fractional gradient field visualizations.

    • VR simulations of 2.5-dimensional loss landscapes.

Future Directions

  1. Quantum Fractional Dynamics
    Merge fractional gradients with variational quantum eigensolvers (VQE) for quantum machine learning.

  2. Bio-Fractional Networks
    Model synaptic plasticity using Hadamard fractional gradients.

  3. Cosmological Scaling
    Apply Weyl fractional integrals to dark matter distribution optimization.

Collaboration Vision
I seek partners to:

  • Implement FractoGrad in LISA’s gravitational wave detection pipelines.

  • Co-develop fractional blockchain consensus protocols with the Ethereum Foundation.

  • Explore fractional optimization in protein folding with the AlphaFold 4 team.

Fractional Optimization

Developing fractional-order algorithms for neural network optimization.

Phase One

Constructing the mathematical framework for fractional-order algorithms.

Phase Two

Implementing fractional-order optimizers and testing them on benchmark datasets.

Phase Three

Applying optimizers to train and evaluate GPT models.

Evaluation Phase

Systematically assessing the effect of fractional-order parameters on performance.

My previous research has focused primarily on optimization algorithms and deep learning theory. In "Fractional-Order Gradient Methods for Deep Neural Networks" (IEEE Transactions on Neural Networks, 2021), I proposed a preliminary framework for applying fractional calculus to neural network optimization, proving that fractional-order gradient descent accelerates convergence under certain conditions. In "Memory Effects in Recurrent Neural Networks with Fractional-Order Operators" (Neural Computation, 2022), I explored how fractional-order operators enhance long-term memory in RNNs, achieving significant improvements on time-series prediction tasks. Most recently, "Adaptive Fractional-Order Optimization for Transfer Learning" (ICLR 2023) demonstrated how adaptive fractional-order algorithms mitigate negative transfer in transfer learning. Together, these studies lay the foundation for exploring fractional-order optimization in large language models and demonstrate my expertise in this emerging field.