I am a computer programmer and work for a multi-level marketing company and am creating an application that will help Sellers to predict whether they will promote to the next level.
I have been stuck on the following problem and am hoping some algebra wizards out there can assist me.
Here is a real example of what has happened to a Seller in the past.
BUSINESS RULE: A Seller can use a maximum of 128,000 points from each DownLine over the 3-month qualification period.
Note: This Seller has 2 DownLines, so 128,000 x 2 = 256,000 points total can be used over the 3 months. There are enough points available; the trick is to determine how many points to use from each DownLine each month so that the maximum-points rule (above) is not broken for either DownLine. Even glancing at this, you can tell me the answer, but what I need is an algebraic formula that I can reuse with different numbers for different Sellers to derive the answer. For instance, I know the following intuitively (the values for the variables listed below), but I want to be able to derive it via an algorithm:
Month1TotalDownline1PointsUsedToReachNEEDED: ???? // total Month1 = 110360
Month1TotalDownline2PointsUsedToReachNEEDED: ???? // total Month1 = 110360
Month2TotalDownline1PointsUsedToReachNEEDED: ???? // total Month2 = 0
Month2TotalDownline2PointsUsedToReachNEEDED: ???? // total Month2 = 0
Month3TotalDownline1PointsUsedToReachNEEDED: ???? // total Month3 = 113320
Month3TotalDownline2PointsUsedToReachNEEDED: ???? // total Month3 = 113320
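In case it helps anyone answer, here is one way I could imagine coding it (a sketch, not a definitive solution): split each month's needed total evenly across the DownLines, then cover any shortfall from DownLines that still have room under their cap. This assumes the per-DownLine cap is the only constraint; the function name `allocate_points` and its parameters are mine, not from any existing system.

```python
def allocate_points(monthly_needed, caps):
    """Return per-month, per-DownLine point usage, or None if the caps
    make some month's needed total unreachable.

    monthly_needed: list of point totals needed in each month
    caps: list of maximum points usable from each DownLine over the period
    """
    remaining = list(caps)  # points still usable from each DownLine
    plan = []
    for needed in monthly_needed:
        alloc = [0] * len(remaining)
        # First pass: take an even share from each DownLine, limited by its cap.
        share = needed // len(remaining)
        for i in range(len(remaining)):
            take = min(share, remaining[i])
            alloc[i] = take
            remaining[i] -= take
        # Second pass: cover any shortfall from DownLines with room to spare.
        shortfall = needed - sum(alloc)
        for i in range(len(remaining)):
            if shortfall == 0:
                break
            take = min(shortfall, remaining[i])
            alloc[i] += take
            remaining[i] -= take
            shortfall -= take
        if shortfall > 0:
            return None  # caps exhausted before this month's total was met
        plan.append(alloc)
    return plan

# The example above: needed totals 110360 / 0 / 113320, two DownLines
# capped at 128,000 each.
print(allocate_points([110360, 0, 113320], [128000, 128000]))
# → [[55180, 55180], [0, 0], [56660, 56660]]
```

When the even split works (as it does here), each DownLine simply contributes needed/2 per month, which may be the closed-form answer being asked for; the second pass only matters when one DownLine's cap runs out first.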
Thank you so much for whatever help any of you can give me in solving this problem and improving my algebra, so that I am better able to solve problems like this and assist others with them in the future!
P.S. Please let me know if I should be posting this in a different forum.