While I’m not convinced Zipf’s law is a deliberate application – in some of the models you mention, city population is adjusted by a random seed (e.g., Hulings) – I agree that straight-up average is boring.

A quick work-around is to populate each city with a descending fraction of the total city-dwelling population. Start with the sum of the first ‘n’ integers, where ‘n’ equals the number of cities. Let’s say we have 8 cities, so the sum is 36 *[ 8 * (8 + 1) / 2 = 36 ].* Then assume the first (biggest) city has 8/36 of the city-dwelling population; the second-biggest city has 7/36, the third has 6/36, and so on.

Let’s say you have a total city-dwelling population of 90,000. For 8 cities, the populations are as follows:

* City 1 –> 90,000 * (8/36) = 20,000

* City 2 –> 90,000 * (7/36) = 17,500

* City 3 –> 90,000 * (6/36) = 15,000

* City 4 –> 90,000 * (5/36) = 12,500

* City 5 –> 90,000 * (4/36) = 10,000

* City 6 –> 90,000 * (3/36) = 7,500

* City 7 –> 90,000 * (2/36) = 5,000

* City 8 –> 90,000 * (1/36) = 2,500
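The scheme above is easy to automate for any number of cities. Here’s a minimal sketch (the function name is mine, not from any of the generators mentioned):

```python
def city_populations(total, n):
    """Split a total city-dwelling population across n cities using
    descending triangular-number fractions: the biggest city gets
    n/T of the total, the next gets (n-1)/T, ..., where T = n(n+1)/2."""
    t = n * (n + 1) // 2  # sum of the first n integers
    return [total * (n - i) / t for i in range(n)]

print(city_populations(90_000, 8))
# [20000.0, 17500.0, 15000.0, 12500.0, 10000.0, 7500.0, 5000.0, 2500.0]
```

The shares always sum back to the original total, so you never lose or invent population to rounding in this version.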

Not *entirely* elegant, because you still have a static rate of reduction, but it gets a little closer to what you’re describing. Hope this helps.
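If the static rate of reduction bothers you, a rough Zipf-style variant weights each city by 1/rank instead, so the drop-off is steep at the top and flattens toward the small towns. A sketch (again, names are mine; the `s` exponent lets you tune how top-heavy the distribution is):

```python
def zipf_populations(total, n, s=1.0):
    """Zipf-style split: city of rank k gets a share proportional to
    1 / k**s. With s = 1 the second city is half the first, the third
    a third, etc.; larger s makes the biggest city more dominant."""
    weights = [1 / k**s for k in range(1, n + 1)]
    scale = total / sum(weights)
    return [round(w * scale) for w in weights]
```

For 90,000 people across 8 cities this makes the capital noticeably larger than the triangular scheme does, which may be closer to the real-world pattern Zipf’s law describes. Note that rounding each share can leave the total a few people off.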

The caveat is figuring out how long after the event normal population growth resumes (it took roughly 200 to 300 years for Europe’s population to recover after the Black Death). By that metric (and avoiding more math than I think is necessary for an RPG setting), I’d only factor in depopulating events within the last 250 years.

To give you an idea of how your setting might be affected by major depopulating events, check out the Wikipedia article: https://en.wikipedia.org/wiki/Consequences_of_the_Black_Death
