Instead of the letters A through F, why not represent grades as a decimal from 0 to 5 (e.g. 3.7)? This way, subtle gradations could be expressed, while the general quality of the grade would still be apparent from its integer part: anything in the 4s would be roughly an A in the current system, anything in the 3s a B, and so on.
Why round the grades off to discrete values and lose information?
Also, in the interests of combating grade inflation, I would go so far as to calculate normalized grades, which would essentially represent the number of standard deviations a student's regular grade lies from the mean for a given class. So a normalized grade of 1.5 would be one and a half standard deviations above the mean, while -0.8 would be 0.8 standard deviations below it. Thus, in a class where the professor gave all high grades in the 4.5 to 5 range, the better students could still be distinguished by their normalized grades.
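The normalization described above is just a z-score. As a rough sketch (the function name and the sample grades here are my own illustration, not part of the proposal), it might look like:

```python
from statistics import mean, pstdev

def normalized_grades(grades):
    """Map each raw grade to its number of (population) standard
    deviations above or below the class mean -- a z-score."""
    mu = mean(grades)
    sigma = pstdev(grades)
    if sigma == 0:
        # Everyone got the same grade; there are no distinctions to draw.
        return [0.0 for _ in grades]
    return [(g - mu) / sigma for g in grades]

# A class where the professor gave all high grades in the 4.5-5 range:
raw = [4.5, 4.6, 4.7, 4.8, 5.0]
print([round(z, 2) for z in normalized_grades(raw)])
```

Even though the raw grades span only half a point, the normalized grades spread the students out over roughly three standard deviations, so the ranking within the class is still visible.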
Just some thoughts: what am I missing?