Hollywood ending

Noun

 * 1) An outcome considered to be typical of certain movies produced in Hollywood, California, in which all desirable results are achieved, with protagonists being rewarded, antagonists being punished or destroyed, and positive sentiments (love, happiness, peace) prevailing over negative ones.