TaylorDiff.jl

Badges: Project Status – Active · License: MIT · Documentation (Stable / Dev) · Build Status · Benchmark Status · ColPrac · SciML Code Style

TaylorDiff.jl is an automatic differentiation (AD) package for efficient and composable higher-order derivatives, implemented via operator overloading on Taylor polynomials.
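
To illustrate the operator-overloading idea, here is a minimal, self-contained sketch (not the package's actual implementation): a value is carried together with its truncated Taylor coefficients, so a single evaluation of a function propagates all derivatives up to the chosen order.

# Illustrative sketch only, not TaylorDiff.jl internals: a number bundled with
# its Taylor coefficients up to order 2 around the evaluation point.
struct Taylor2
    c0::Float64  # f(x)
    c1::Float64  # f'(x)
    c2::Float64  # f''(x) / 2
end

Base.:+(a::Taylor2, b::Taylor2) = Taylor2(a.c0 + b.c0, a.c1 + b.c1, a.c2 + b.c2)
Base.:*(a::Taylor2, b::Taylor2) = Taylor2(  # truncated Cauchy product
    a.c0 * b.c0,
    a.c0 * b.c1 + a.c1 * b.c0,
    a.c0 * b.c2 + a.c1 * b.c1 + a.c2 * b.c0)

seed(x) = Taylor2(x, 1.0, 0.0)   # represents x + t

p = seed(2.0)
t = p * p * p                    # evaluate f(x) = x^3 in one pass
(t.c0, t.c1, 2 * t.c2)           # (8.0, 12.0, 12.0) == (f, f', f'') at x = 2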

Disclaimer: this project is still in an early alpha stage, and APIs may change at any time. Discussions and potential use cases are extremely welcome!

Features

TaylorDiff.jl is designed with the following goals in mind:

  • Linear scaling with the order of differentiation, whereas naively composing first-order differentiation scales exponentially (see the sketch after this list)
  • The same performance as ForwardDiff.jl at first and second order, so there is no penalty for using it as a drop-in replacement
  • Capable of computing exact derivatives in physical models involving ODEs and PDEs
  • Composable with other AD systems such as Zygote.jl, so that models evaluated with TaylorDiff can be further optimized with gradient-based optimization techniques
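
As a concrete (hedged) example of the scaling claim, assuming both packages are installed: the third derivative of sin at a point takes a single Taylor-mode pass, whereas nesting ForwardDiff.jl repeats first-order work at every level of nesting.

using TaylorDiff, ForwardDiff

x = 0.1
TaylorDiff.derivative(sin, x, 3)   # one pass over a degree-3 Taylor polynomial

# Nesting first-order duals: each additional order multiplies the work,
# so the cost grows exponentially with the order of differentiation.
third(f) = y -> ForwardDiff.derivative(
    a -> ForwardDiff.derivative(b -> ForwardDiff.derivative(f, b), a), y)
third(sin)(x)                      # same value, -cos(0.1), at a different asymptotic cost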

TaylorDiff.jl is fast! See our dedicated benchmarks page for comparisons with other packages on various tasks.

Installation

] add TaylorDiff
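
The command above is entered in the Pkg REPL mode (press ] at the Julia prompt). Equivalently, from a script:

using Pkg
Pkg.add("TaylorDiff")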

Usage

using TaylorDiff

x = 0.1
derivative(sin, x, 10) # 10th derivative of sin at x
v, direction = [3.0, 4.0], [1.0, 0.0]
derivative(x -> sum(exp.(x)), v, direction, 2) # 2nd-order directional derivative at v along direction
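
The composability goal can be sketched as follows; this is a hedged example assuming Zygote.jl is installed and that TaylorDiff's chain rules cover this pattern (the residual function and parameter names here are illustrative, not part of the API):

using TaylorDiff, Zygote

# Toy "physics" residual u''(x) + u(x) for the ansatz u(y) = c * y^2,
# with the second derivative taken by TaylorDiff.
residual(c, x) = derivative(y -> c * y^2, x, 2) + c * x^2

loss(c) = abs2(residual(c, 0.3))
Zygote.gradient(loss, 1.5)   # reverse-mode gradient through the Taylor-mode derivative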

Please see our documentation for more details.

Related Projects

  • TaylorSeries.jl: a systematic treatment of Taylor polynomials in one and several variables; however, its mutating, scalar-oriented code limits speed and composability with other packages
  • ForwardDiff.jl: a well-established and robust operator-overloading-based forward-mode AD package, where higher-order derivatives are obtained by nesting first-order derivatives
  • Diffractor.jl: next-generation source-code-transformation-based forward- and reverse-mode AD, designed with higher-order derivatives in mind, although that functionality is currently only a proof of concept
  • jax.jet: an experimental (and unmaintained) implementation of Taylor-mode automatic differentiation in JAX, which shares the same underlying algorithm as this project

Citation

@software{tan2022taylordiff,
  author = {Tan, Songchen},
  title = {TaylorDiff.jl: Fast Higher-order Automatic Differentiation in Julia},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/JuliaDiff/TaylorDiff.jl}}
}
