
Students check out whether taking 50% off and then an additional 20% discount really gives them 70% off. Why does the combined discount come out differently?
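Successive discounts multiply the remaining price rather than adding together, which is why the combined discount is less than the sum of the two. A short sketch of that arithmetic (the `combined_discount` helper is just for illustration):

```python
def combined_discount(*discounts):
    """Return the single discount equivalent to applying each discount in turn."""
    remaining = 1.0
    for d in discounts:
        remaining *= (1 - d)  # each discount scales what's left of the price
    return 1 - remaining

# 50% off leaves 0.50 of the price; a further 20% off leaves 0.50 * 0.80 = 0.40.
# So the combined discount is 60%, not 70%.
print(combined_discount(0.50, 0.20))  # 0.6
```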
Students use yearly percent increase data to decide which stock (Apple, Disney, or Amazon) they should have invested in back in 2015.
What was the net gain of each stock?
How do you figure that out?
Does the arithmetic average or the geometric average of those yearly increases match a stock's total gain or loss over the years?
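Yearly returns compound, so the geometric average of the growth factors (not the arithmetic average of the percent changes) is the one that reproduces the total gain. A sketch with made-up yearly returns, not actual Apple, Disney, or Amazon data:

```python
# Illustrative (invented) yearly percent changes: +10%, -5%, +20%
returns = [0.10, -0.05, 0.20]
n = len(returns)

# Total growth over all years is the product of the yearly growth factors.
total_growth = 1.0
for r in returns:
    total_growth *= (1 + r)

arithmetic_avg = sum(returns) / n
geometric_avg = total_growth ** (1 / n) - 1

# Compounding the arithmetic average generally overstates the gain...
print((1 + arithmetic_avg) ** n)
# ...while compounding the geometric average recovers the total exactly.
print((1 + geometric_avg) ** n)
print(total_growth)
```

Comparing the last two printed values shows why the geometric mean is the right "average yearly return" for questions like this.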