It all boils down to what task your model should learn to do. You just need to choose one loss function for your model to minimize, right?
Well, it turns out you can train your model to learn multiple tasks at once. When you do, chances are your model will generalize better!
In this talk you'll understand why introducing multiple tasks to your Deep Neural Network is helpful;
You'll see concrete examples of when it's effective to do so;
And most importantly, you'll learn how to implement it.
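As a taste of what the implementation looks like, here is a minimal multi-task sketch in PyTorch (illustrative only, not the talk's code): a shared trunk feeds two task-specific heads, and a single combined loss, here a weighted sum of a classification loss and a regression loss with an arbitrary 0.5 weight, drives one optimizer step.

```python
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    """One shared trunk, two task-specific heads (hypothetical example)."""
    def __init__(self, in_dim=10, hidden=32, n_classes=3):
        super().__init__()
        # Shared layers learn a representation useful for both tasks.
        self.trunk = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.cls_head = nn.Linear(hidden, n_classes)  # task A: classification
        self.reg_head = nn.Linear(hidden, 1)          # task B: regression

    def forward(self, x):
        h = self.trunk(x)
        return self.cls_head(h), self.reg_head(h)

model = MultiTaskNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy batch: 8 samples, each labeled for both tasks.
x = torch.randn(8, 10)
y_cls = torch.randint(0, 3, (8,))
y_reg = torch.randn(8, 1)

logits, preds = model(x)
# Combined objective: weighted sum of per-task losses
# (the 0.5 weight is an arbitrary illustrative choice).
loss = nn.CrossEntropyLoss()(logits, y_cls) + 0.5 * nn.MSELoss()(preds, y_reg)

opt.zero_grad()
loss.backward()  # gradients from BOTH tasks flow into the shared trunk
opt.step()
```

Because both losses backpropagate through the same trunk, the shared representation is pushed to serve both tasks at once, which is exactly the regularizing effect the talk explores.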