Adjoint algorithmic differentiation (AAD)
Adjoint algorithmic differentiation is a mathematical technique used to significantly speed up the calculation of sensitivities of derivatives prices to underlying factors, called Greeks. It is widely used in the risk management of complex derivatives and valuation adjustments.
Greeks have traditionally been calculated by making small adjustments to the values of the inputs in the pricing of a derivative and recalculating the output value each time – a process known as bumping. This can be time-consuming for a portfolio of thousands of trades, because each bumped input requires a full revaluation of the trade, and each valuation itself involves many steps, each requiring the output of the previous step in order to proceed.
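The bumping approach described above can be sketched as follows. This is a toy illustration using a standard Black-Scholes call price, not any particular production pricer; the function and parameter names are illustrative choices.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, vol, rate, tau):
    # Black-Scholes price of a European call option.
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * tau) / (vol * sqrt(tau))
    d2 = d1 - vol * sqrt(tau)
    return spot * norm_cdf(d1) - strike * exp(-rate * tau) * norm_cdf(d2)

def bump_greek(pricer, args, index, h=1e-4):
    # Central finite difference in one input: two extra full
    # revaluations per Greek, so n inputs cost roughly 2n repricings.
    up = list(args); up[index] += h
    dn = list(args); dn[index] -= h
    return (pricer(*up) - pricer(*dn)) / (2.0 * h)

args = (100.0, 100.0, 0.2, 0.05, 1.0)  # spot, strike, vol, rate, expiry
delta = bump_greek(bs_call, args, 0)   # sensitivity to spot
vega = bump_greek(bs_call, args, 2)    # sensitivity to volatility
```

Note that every additional Greek requires its own pair of repricings, which is what makes bumping expensive for large portfolios with many risk factors.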
AAD instead treats the valuation as a sequence of elementary steps and differentiates that sequence in reverse. This is made possible by a key mathematical property of sensitivities, the chain rule of differentiation, which links the derivatives of the parts of a function to the derivative of the whole. The chain rule allows the sensitivity of the output to be propagated backwards through the intermediate steps, so that a single backward sweep yields the sensitivities with respect to all of the inputs at once, rather than one revaluation per bumped input.
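The backward propagation described above can be sketched with a minimal tape-based reverse-mode differentiator. This is a toy illustration of the mechanism, not a production AAD library: each operation records its local partial derivatives, and one backward sweep applies the chain rule from the output to every input.

```python
class Var:
    """A value on the tape, with links to its parents and their local partials."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent Var, local partial derivative)
        self.adjoint = 0.0      # accumulated d(output)/d(this variable)

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self):
        # Seed the output adjoint, then sweep the tape in reverse
        # topological order, propagating adjoints via the chain rule.
        self.adjoint = 1.0
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for parent, _ in v.parents:
                    visit(parent)
                order.append(v)
        visit(self)
        for v in reversed(order):
            for parent, local in v.parents:
                parent.adjoint += v.adjoint * local

# f(x, y) = (x + y) * x, so df/dx = 2x + y and df/dy = x.
x, y = Var(3.0), Var(4.0)
f = (x + y) * x
f.backward()
# One backward sweep has produced both sensitivities:
# x.adjoint == 10.0 and y.adjoint == 3.0
```

The key point is that the cost of the backward sweep is comparable to one extra valuation regardless of the number of inputs, which is where the speed-up over bumping comes from.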
The technique has been shown to compute Greeks up to 1,000 times faster than the bumping method. Disadvantages of AAD include lengthy development time and the need for highly skilled quantitative programmers.