# optimizer-plugins

**Repository Path**: facebookresearch/optimizer-plugins

## Basic Information

- **Project Name**: optimizer-plugins
- **Description**: A demonstration of source code transformation to implement automatic differentiation, compatible with an operation overload style AD library DiffKt.
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2023-07-31
- **Last Updated**: 2023-07-31

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# AdOptimize Kotlin Plugin

AdOptimize is a Kotlin compiler plugin that transforms functions written against an operation-overloading automatic differentiation library (OO AD) into functions over Kotlin `Float`s that return both the derivatives and the primal value (source code transformation AD, or SCT AD). It also includes a proof-of-concept feature that transforms functions over Kotlin `Float`s into functions that return the derivative. AdOptimize is a research project and is not recommended for production use.

AdOptimize is unique in that the functions it produces can interoperate with functions written using the library. AdOptimize is also a proof of concept of the IR-serialization feature of the Kotlin compiler. When this feature is enabled, the Kotlin IR is serialized into the class files of a compiled JVM program, from which the compiler can extract it while compiling a consuming module. The benefit of this feature is that it gives the compiler a view of the entire program, which is a requirement for many optimizations, including inlining derivative code.

### Features:

- Removes the object wrapping that OO AD depends on by performing the differentiation via source code transformation. This is supported for reverse mode and for a combination of forward and reverse mode (a variation of second order) over scalar functions. Only one active parameter is supported.
  (See `adOptimizeClient/src/main/kotlin/client/main.kt` for an example: the function `target` over `DScalar`s demonstrates this unwrapping. The annotations `@SecondOrderOptimize` and `@Optimize` indicate that both the first- and second-order derivatives shall be generated by AdOptimize.)
- Single-parameter reverse mode differentiation for Kotlin `Float`s (see `adOptimizeClient/src/main/kotlin/client/main.kt` for an example: the function `target` over `Float`s demonstrates this prototype).
- Control flow.

### From Source Usage

- Download or clone the repository.
- Publish the API locally using the Gradle task `publishLocal`.
- Create a client project. Depend on DiffKt and create an AD config file that matches `adOptimizeClient/src/main/kotlin/AdConfig.kt`. The annotations can have custom names, but if you change them you must also change them in the build script in the following step.
- Copy the Gradle build from `adOptimizeClient/build.gradle`.
- Write a function using the `@Optimize` annotation and test the performance differences. See `adOptimizeClient/src/main/kotlin/client/main.kt` for an example.

### Testing

- To run the AdOptimize and DiffPrep IR and codegen tests, run `./gradlew integrationTests`.
- IR tests accept a text file of source code and a text file of expected serialized IR, and verify that the source code, when compiled to Kotlin IR and serialized, matches the expected file. These tests are useful during development to verify that stubs have been generated correctly. It is common to produce the expectation file by writing the code you expect the compiler to generate by hand and serializing that hand-written file. Note that if you delete the expectation file, the test framework will generate a default one based on the source file and fail the test.
- Codegen tests accept a text file of source code containing a `box()` method. The test fails if the `box` method does not return `"OK"`.
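For illustration, a minimal codegen test source file might look like the following (a hypothetical sketch; the real test files live in the plugin test directories):

```kotlin
// A codegen test passes only when box() returns "OK".
fun box(): String {
    val expected = 4f
    val actual = 2f * 2f
    return if (actual == expected) "OK" else "fail: got $actual"
}
```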
  Codegen tests are the primary form of testing in the compiler environment.

### Components

**differentiable-api-preprocessor**

The differentiable-api-preprocessor plugin (DiffPrep) is used in the compilation of the operation-overloading library. The tests in AdOptimize rely on the OO AD library DiffKt, but there is no reason another implementation could not be used. The OO AD library annotates key components of the library, such as the differentiable scalar type, and binds the fully qualified name of each annotation to a key defined in DiffPrep. DiffPrep produces a configuration file for AdOptimize to consume during a client application's compilation.

**producer-consumer**

The Producer Consumer is an independent top-level project that demonstrates a client application that uses AdOptimize but relies on a custom OO AD library. It is a good demonstration of the build script configuration.

**adOptimize**

AdOptimize contains the AD generation code. The workflow is as follows: Kotlin IR is translated into DiffIR, a subset of Kotlin IR wrapped in classes that store pullback and activity information for each statement. Reverse and forward mode passes accept DiffIR and return differentiated Kotlin IR. The intention was for the IR returned by each pass to be eligible for transformation back into DiffIR, so that multiple differentiation passes could be applied, thereby supporting arbitrary orders of differentiation and combinations of reverse and forward mode. Only the reverse-forward combination was ever tested.

**plugin-generators-common**

In order to map to DiffIR, the Kotlin code must be lowered. A lowered language makes code generation easier because it reduces the number of constructs that must be supported. For example, we unnest all expressions so that every operation has a variable. The lowerings in this module are used by both DiffPrep and AdOptimize.
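As an illustration of the unnesting lowering described above (a sketch in source form; the actual lowering operates on Kotlin IR, not source text, and the temporary names are invented here):

```kotlin
// Before lowering: nested expressions.
val y = a * a + b * c

// After lowering: every operation binds its result to a fresh variable.
val t0 = a * a
val t1 = b * c
val y2 = t0 + t1
```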
This module also includes extensions of Kotlin compiler types so that functions and classes generated by the plugin are visible to consuming projects. See `GeneratedAuthenticClassDescriptor.kt` and `ClassLifter.kt` for details.

**adOptimizeClient**

The adOptimize client is an independent top-level project that demonstrates how to use AdOptimize with DiffKt.

Below is an illustration of the relationship between a client application, DiffPrep, and AdOptimize.

![plot](AdOptimize.png)

### Code Generation Example

What does AdOptimize generate? Let us suppose we are given code like the following (the loop body shown here is representative):

```kotlin
@Optimize
fun target(a: DScalar, L: Int): DScalar {
    var i = 0
    var s = a * a
    while (i < L) {
        s = s * a
        i++
    }
    return s
}
```

For the reverse-mode derivative, AdOptimize generates a class that runs the primal computation over unwrapped `Float`s while recording the intermediate values needed by the reverse pass. A sketch of the generated class (field names and the recorded values are illustrative):

```kotlin
class target_Reverse(val a: DScalar, val L: Int) {
    val intermediateValues = CodeGenStack<Float>()

    init {
        // The primal runs over unwrapped Floats: no DScalar boxing.
        val a_primal = this.a.basePrimal().value
        val L = this.L
        var i = 0
        var s = a_primal * a_primal
        while (i < L) {
            intermediateValues.push(s) // record values the reverse pass will need
            s = s * a_primal
            i++
        }
        // ... the reverse pass then walks the recorded intermediates
        // to accumulate the derivative.
    }
}
```

The original function is rewritten to dispatch to the generated class when a reverse derivative is requested, and to fall back to the original operation-overloading body otherwise. In this sketch, `reverseRequested` stands in for the plugin's actual dispatch condition and `result` for how the generated class's output is unpacked:

```kotlin
fun target(a: DScalar, L: Int): DScalar = when {
    reverseRequested(a) -> target_Reverse(a, L).result
    else -> {
        // Fall back to the original operation-overloading body.
        var i = 0
        var s = a * a
        while (i < L) {
            s = s * a
            i++
        }
        s
    }
}
```
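Putting it together, a client might exercise the optimized function through DiffKt's reverse-mode entry point. This is a hedged sketch: the DiffKt names used here (`FloatScalar`, `reverseDerivative`, the `org.diffkt` package) are assumptions; see `adOptimizeClient/src/main/kotlin/client/main.kt` for the working example.

```kotlin
import org.diffkt.*

fun main() {
    val x: DScalar = FloatScalar(3f)
    // With AdOptimize enabled, the derivative of `target` is computed by the
    // generated unboxed code rather than the operation-overloading machinery.
    val d = reverseDerivative(x) { a: DScalar -> target(a, 5) }
    println(d)
}
```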