Finding a standard deviation from a set of averages and standard deviations

Applicants to a school take an exam whose sections are weighted as shown below; each section's average score and standard deviation are included. The averages given are for the applicants who were admitted.

Biology: 20%, Avg. 68.68, SD 6.59
Philosophy: 5%, Avg. 61.44, SD 7.63
Physics: 10%, Avg. 63.47, SD 10.0
Interdisciplinary: 10%, Avg. 59.12, SD 8.30
Language: 15%, Avg. 64.53, SD 8.30
Mathematics: 20%, Avg. 78.83, SD 11.30
Chemistry: 10%, Avg. 65.19, SD 6.49
Social Sciences: 10%, Avg. 60.12, SD 6.41

It's easy to find the average total score by multiplying each section's average by its weight (e.g. biology x 0.2) and summing the results. What I don't know how to do is find the standard deviation of the total score. In other words, what range of total scores represents the applicants who were admitted? Or is this not possible with the given data? Thanks so much
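For what it's worth, here is a sketch of both computations. The mean of a weighted sum is always the weighted sum of the means, but the variance of a weighted sum is Var(Σ wᵢXᵢ) = Σ wᵢ² Var(Xᵢ) only if the section scores are uncorrelated, which is a strong assumption (exam section scores are typically positively correlated, in which case the true SD would be larger). Without the covariances between sections, the exact SD of the total cannot be recovered from this data; the independence case below is just a lower-end estimate.

```python
import math

# Section weights, averages, and standard deviations from the question.
sections = {
    "Biology":           (0.20, 68.68, 6.59),
    "Philosophy":        (0.05, 61.44, 7.63),
    "Physics":           (0.10, 63.47, 10.0),
    "Interdisciplinary": (0.10, 59.12, 8.30),
    "Language":          (0.15, 64.53, 8.30),
    "Mathematics":       (0.20, 78.83, 11.30),
    "Chemistry":         (0.10, 65.19, 6.49),
    "Social Sciences":   (0.10, 60.12, 6.41),
}

# Mean of a weighted sum = weighted sum of the means (always true).
weighted_mean = sum(w * avg for w, avg, sd in sections.values())

# ASSUMPTION: sections are uncorrelated, so the cross-covariance
# terms vanish and Var(sum w_i X_i) = sum w_i^2 Var(X_i).
variance_if_independent = sum((w * sd) ** 2 for w, avg, sd in sections.values())
sd_if_independent = math.sqrt(variance_if_independent)

print(round(weighted_mean, 2))      # 67.04
print(round(sd_if_independent, 2))  # 3.33
```

If the sections were perfectly positively correlated instead, the SD of the total would be the weighted sum of the SDs (about 8.48 here), so the true value plausibly lies between those two extremes.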