This op can be used to override the gradient for complicated functions. For
example, suppose y = f(x) and we wish to apply a custom function g for backprop
such that dx = g(dy). In Python,
```python
with tf.get_default_graph().gradient_override_map(
    {'IdentityN': 'OverrideGradientWithG'}):
  y, _ = identity_n([f(x), x])

@tf.RegisterGradient('OverrideGradientWithG')
def ApplyG(op, dy, _):
  return [None, g(dy)]  # Do not backprop to f(x).
```
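As a concrete, self-contained sketch (not part of the op's documentation), the following uses the TF1-style graph API via tf.compat.v1 and makes two illustrative assumptions: f(x) = x**2 stands in for the "complicated" function, and g clips the incoming gradient to [-1, 1].

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # gradient_override_map requires graph mode

@tf.RegisterGradient('OverrideGradientWithG')
def _apply_g(op, dy, _):
  # dy is the gradient flowing into y; send a clipped copy to x (the second
  # input of IdentityN) and block backprop into f(x) (the first input).
  return [None, tf.clip_by_value(dy, -1.0, 1.0)]

x = tf.constant(3.0)
fx = tf.square(x)  # placeholder for a complicated f(x)

with tf.get_default_graph().gradient_override_map(
    {'IdentityN': 'OverrideGradientWithG'}):
  y, _ = tf.identity_n([fx, x])

dx = tf.gradients(y, x)[0]  # computed as g(dy) instead of f'(x) * dy

with tf.Session() as sess:
  print(sess.run(dx))  # 1.0: the upstream gradient (1.0) after clipping
```

In eager TF2 code, tf.custom_gradient typically serves the same purpose without an override map.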
Args
  input: A list of Tensor objects.
  name: A name for the operation (optional).

Returns
  A list of Tensor objects. Has the same type as input.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-01-23 UTC."],[],[]]