Florida did not become a state until 1845. Under the 1819 treaty with Spain, East and West Florida passed to federal jurisdiction and were organized in 1821 as the Territory of Florida, which remained a territory until it was admitted as a state under the United States Constitution. Most of the meager population inhabited the extreme northern portion of the territory. [...]